rhai wrapper ai gen wip

Timur Gordon
2025-04-13 00:35:29 +02:00
parent e5de293919
commit 4b39b137de
19 changed files with 3491 additions and 0 deletions

View File

@@ -0,0 +1,123 @@
# Escalayer
Escalayer is a module for executing AI tasks with automatic escalation to more powerful models when needed. It provides a framework for creating complex AI workflows by breaking them down into sequential unit tasks.
## Overview
Escalayer allows you to:
1. Create complex AI tasks composed of multiple sequential unit tasks
2. Execute each unit task with a cheap AI model first
3. Automatically retry with a more powerful model if the task fails
4. Process and validate AI responses with custom callback functions
## Architecture
The module is organized into the following components:
- **Task**: Represents a complete AI task composed of multiple sequential unit tasks
- **UnitTask**: Represents a single step in the task with prompt generation and response processing
- **ModelConfig**: Defines the configuration for an AI model
- **OpenRouter Integration**: Uses OpenRouter to access a wide range of AI models
## Usage
### Basic Example
```v
import freeflowuniverse.herolib.mcp.aitools.escalayer
fn main() {
// Create a new task
mut task := escalayer.new_task(
name: 'rhai_wrapper_creator'
description: 'Create Rhai wrappers for Rust functions'
)
// Define the unit tasks
task.new_unit_task(
name: 'separate_functions'
prompt_function: separate_functions
callback_function: process_functions
)
// Initiate the task
result := task.initiate('path/to/rust/file.rs') or {
println('Task failed: ${err}')
return
}
println('Task completed successfully')
println(result)
}
// Define the prompt function
fn separate_functions(input string) string {
return 'Read rust file and separate it into functions ${input}'
}
// Define the callback function
fn process_functions(response string)! string {
// Process the AI response
// Return error if processing fails
if response.contains('error') {
return error('Failed to process functions: Invalid response format')
}
return response
}
```
### Advanced Configuration
You can configure each unit task with different models, retry counts, and other parameters:
```v
// Configure with custom parameters
task.new_unit_task(
name: 'create_wrappers'
prompt_function: create_wrappers
callback_function: process_wrappers
retry_count: 2
base_model: escalayer.ModelConfig{
name: 'claude-3-haiku-20240307'
provider: 'anthropic'
temperature: 0.5
max_tokens: 4000
}
)
```
## How It Works
1. When you call `task.initiate(input)`, the first unit task is executed with its prompt function.
2. The prompt is sent to the base AI model.
3. The response is processed by the callback function.
4. If the callback returns an error, the task is retried with the same model.
5. After a specified number of retries, the task escalates to a more powerful model.
6. Once a unit task succeeds, its result is passed as input to the next unit task.
7. This process continues until all unit tasks are completed.
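In code terms, the retry and escalation flow described above boils down to a loop like the following sketch (simplified from `unit_task.v`; the real implementation also counts attempts per model and logs progress):
```v
// Simplified sketch of the escalation loop in unit_task.v (not the full implementation)
fn execute_sketch(ut UnitTask, input string) !string {
	prompt := ut.prompt_function(input)
	mut model := ut.base_model
	mut last_error := ''
	for attempt in 1 .. (2 * ut.retry_count + 2) {
		if attempt > ut.retry_count {
			model = ut.retry_model // escalate once the base model's attempts are used up
		}
		// On retries, the previous error message is prepended to the prompt
		current_prompt := if last_error == '' {
			prompt
		} else {
			'Previous attempt failed with error: ${last_error}\n\n${prompt}'
		}
		response := call_ai_model(current_prompt, model) or {
			last_error = err.str()
			continue
		}
		result := ut.callback_function(response) or {
			last_error = err.str()
			continue
		}
		return result
	}
	return error('Failed after all attempts. Last error: ${last_error}')
}
```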
## Environment Setup
Escalayer uses OpenRouter for AI model access. Set the following environment variable:
```
OPENROUTER_API_KEY=your_api_key_here
```
You can get an API key from [OpenRouter](https://openrouter.ai/).
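Internally the module reuses the herolib OpenAI client and simply points it at the OpenRouter endpoint; this mirrors `openrouter.v` from this module:
```v
import os
import freeflowuniverse.herolib.clients.openai

// How the client is built from the environment variable (see openrouter.v)
fn get_openrouter_client() !&openai.OpenAI {
	api_key := os.getenv('OPENROUTER_API_KEY')
	if api_key == '' {
		return error('OPENROUTER_API_KEY environment variable not set')
	}
	// Reuse the OpenAI client, pointed at the OpenRouter base URL
	return openai.new(
		api_key:  api_key
		base_url: 'https://openrouter.ai/api/v1'
	)!
}
```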
## Original Requirements
This module was designed based on the following requirements:
- Create a system for executing AI tasks with a retry mechanism
- Escalate to more powerful models if cheaper models fail
- Use the OpenAI client via OpenRouter for AI calls
- Break down complex tasks into sequential unit tasks
- Each unit task has a function that generates a prompt and a callback that processes the response
- Retry if the callback returns an error, with the error message prepended to the input string
For a detailed architecture overview, see [escalayer_architecture.md](./escalayer_architecture.md).
For a complete example, see [example.v](./example.v).

View File

@@ -0,0 +1,40 @@
module escalayer
import freeflowuniverse.herolib.clients.openai
// TaskParams defines the parameters for creating a new task
@[params]
pub struct TaskParams {
pub:
name string
description string
}
// Create a new task
pub fn new_task(params TaskParams) &Task {
return &Task{
name: params.name
description: params.description
unit_tasks: []
current_result: ''
}
}
// Default model configurations
pub fn default_base_model() ModelConfig {
return ModelConfig{
name: 'qwen2.5-7b-instruct'
provider: 'openai'
temperature: 0.7
max_tokens: 2000
}
}
pub fn default_retry_model() ModelConfig {
return ModelConfig{
name: 'gpt-4'
provider: 'openai'
temperature: 0.7
max_tokens: 4000
}
}

View File

@@ -0,0 +1,448 @@
# Escalayer Architecture
This document outlines the architecture for the Escalayer module, which provides a framework for executing AI tasks with automatic escalation to more powerful models when needed.
## 1. Module Structure
```
lib/mcp/aitools/escalayer/
├── escalayer.v # Main module file with public API
├── task.v # Task implementation
├── unit_task.v # Unit task implementation
├── models.v # Model definitions and configurations
├── openrouter.v # OpenRouter API client
└── README.md # Documentation
```
## 2. Core Components
### 2.1 Data Structures
```mermaid
classDiagram
class Task {
+string name
+string description
+[]UnitTask unit_tasks
+string current_result
+new_unit_task(params UnitTaskParams) UnitTask
+initiate(input string)! string
}
class UnitTask {
+string name
+Function prompt_function
+Function callback_function
+ModelConfig base_model
+ModelConfig retry_model
+int retry_count
+execute(input string)! string
}
class ModelConfig {
+string name
+string provider
+float temperature
+int max_tokens
}
Task "1" *-- "many" UnitTask : contains
UnitTask "1" *-- "1" ModelConfig : base_model
UnitTask "1" *-- "1" ModelConfig : retry_model
```
### 2.2 Component Descriptions
#### Task
- Represents a complete AI task composed of multiple sequential unit tasks
- Manages the flow of data between unit tasks
- Tracks overall task progress and results
#### UnitTask
- Represents a single step in the task
- Contains a prompt function that generates the AI prompt
- Contains a callback function that processes the AI response
- Manages retries and model escalation
#### ModelConfig
- Defines the configuration for an AI model
- Includes model name, provider, and parameters like temperature and max tokens
#### OpenRouter Client
- Handles communication with the OpenRouter API
- Sends prompts to AI models and receives responses
## 3. Implementation Details
### 3.1 escalayer.v (Main Module)
```v
module escalayer
import freeflowuniverse.herolib.clients.openai
// TaskParams defines the parameters for creating a new task
@[params]
pub struct TaskParams {
pub:
name string
description string
}
// Create a new task
pub fn new_task(params TaskParams) &Task {
return &Task{
name: params.name
description: params.description
unit_tasks: []
current_result: ''
}
}
// Default model configurations
pub fn default_base_model() ModelConfig {
return ModelConfig{
name: 'gpt-3.5-turbo'
provider: 'openai'
temperature: 0.7
max_tokens: 2000
}
}
pub fn default_retry_model() ModelConfig {
return ModelConfig{
name: 'gpt-4'
provider: 'openai'
temperature: 0.7
max_tokens: 4000
}
}
```
### 3.2 task.v
```v
module escalayer
// Task represents a complete AI task composed of multiple sequential unit tasks
pub struct Task {
pub mut:
name string
description string
unit_tasks []UnitTask
current_result string
}
// UnitTaskParams defines the parameters for creating a new unit task
@[params]
pub struct UnitTaskParams {
pub:
name string
prompt_function fn(string) string
callback_function fn(string)! string
base_model ?ModelConfig
retry_model ?ModelConfig
retry_count ?int
}
// Add a new unit task to the task
pub fn (mut t Task) new_unit_task(params UnitTaskParams) &UnitTask {
mut unit_task := UnitTask{
name: params.name
prompt_function: params.prompt_function
callback_function: params.callback_function
base_model: if params.base_model != none { params.base_model? } else { default_base_model() }
retry_model: if params.retry_model != none { params.retry_model? } else { default_retry_model() }
retry_count: if params.retry_count != none { params.retry_count? } else { 3 }
}
t.unit_tasks << unit_task
return &t.unit_tasks[t.unit_tasks.len - 1]
}
// Initiate the task execution
pub fn (mut t Task) initiate(input string)! string {
mut current_input := input
for i, mut unit_task in t.unit_tasks {
println('Executing unit task ${i+1}/${t.unit_tasks.len}: ${unit_task.name}')
// Execute the unit task with the current input
result := unit_task.execute(current_input)!
// Update the current input for the next unit task
current_input = result
t.current_result = result
}
return t.current_result
}
```
### 3.3 unit_task.v
```v
module escalayer
import freeflowuniverse.herolib.clients.openai
// UnitTask represents a single step in the task
pub struct UnitTask {
pub mut:
name string
prompt_function fn(string) string
callback_function fn(string)! string
base_model ModelConfig
retry_model ModelConfig
retry_count int
}
// Execute the unit task
pub fn (mut ut UnitTask) execute(input string)! string {
// Generate the prompt using the prompt function
prompt := ut.prompt_function(input)
// Try with the base model first
mut current_model := ut.base_model
mut attempts := 0
mut max_attempts := ut.retry_count + 1 // +1 for the initial attempt
mut last_error := ''
for attempts < max_attempts {
attempts++
// Once retries with the base model are exhausted, switch to the retry model (only once)
if attempts == ut.retry_count + 1 {
println('Escalating to more powerful model: ${ut.retry_model.name}')
current_model = ut.retry_model
max_attempts += ut.retry_count // allow the same number of attempts with the retry model
}
println('Attempt ${attempts} with model ${current_model.name}')
// Prepare the prompt with error feedback if this is a retry
mut current_prompt := prompt
if last_error != '' {
current_prompt = 'Previous attempt failed with error: ${last_error}\n\n${prompt}'
}
// Call the AI model
response := call_ai_model(current_prompt, current_model) or {
println('AI call failed: ${err}')
last_error = err.str()
continue // Try again
}
// Process the response with the callback function
result := ut.callback_function(response) or {
// If callback returns an error, retry with the error message
println('Callback returned error: ${err}')
last_error = err.str()
continue // Try again
}
// If we get here, the callback was successful
return result
}
return error('Failed to execute unit task after ${attempts} attempts. Last error: ${last_error}')
}
```
### 3.4 models.v
```v
module escalayer
// ModelConfig defines the configuration for an AI model
pub struct ModelConfig {
pub mut:
name string
provider string
temperature f32
max_tokens int
}
// Call an AI model using OpenRouter
fn call_ai_model(prompt string, model ModelConfig)! string {
// Get OpenAI client (configured for OpenRouter)
mut client := get_openrouter_client()!
// Create the message for the AI
mut m := openai.Messages{
messages: [
openai.Message{
role: .system
content: 'You are a helpful assistant.'
},
openai.Message{
role: .user
content: prompt
}
]
}
// Call the AI model
res := client.chat_completion(
msgs: m,
model: model.name,
temperature: model.temperature,
max_completion_tokens: model.max_tokens
)!
// Extract the response content
if res.choices.len > 0 {
return res.choices[0].message.content
}
return error('No response from AI model')
}
```
### 3.5 openrouter.v
```v
module escalayer
import freeflowuniverse.herolib.clients.openai
import os
// Get an OpenAI client configured for OpenRouter
fn get_openrouter_client()! &openai.OpenAI {
// Get API key from environment variable
api_key := os.getenv('OPENROUTER_API_KEY')
if api_key == '' {
return error('OPENROUTER_API_KEY environment variable not set')
}
// Create OpenAI client with OpenRouter base URL
mut client := openai.new(
api_key: api_key,
base_url: 'https://openrouter.ai/api/v1'
)!
return client
}
```
## 4. Usage Example
```v
import freeflowuniverse.herolib.mcp.aitools.escalayer
fn main() {
// Create a new task
mut task := escalayer.new_task(
name: 'rhai_wrapper_creator'
description: 'Create Rhai wrappers for Rust functions'
)
// Define the unit tasks
task.new_unit_task(
name: 'separate_functions'
prompt_function: separate_functions
callback_function: process_functions
)
task.new_unit_task(
name: 'create_wrappers'
prompt_function: create_wrappers
callback_function: process_wrappers
retry_count: 2
)
task.new_unit_task(
name: 'create_tests'
prompt_function: create_tests
callback_function: process_tests
base_model: escalayer.ModelConfig{
name: 'claude-3-haiku-20240307'
provider: 'anthropic'
temperature: 0.5
max_tokens: 4000
}
)
// Initiate the task
result := task.initiate('path/to/rust/file.rs') or {
println('Task failed: ${err}')
return
}
println('Task completed successfully')
println(result)
}
// Define the prompt functions
fn separate_functions(input string) string {
return 'Read rust file and separate it into functions ${input}'
}
fn create_wrappers(input string) string {
return 'Create rhai wrappers for rust functions ${input}'
}
fn create_tests(input string) string {
return 'Create tests for rhai wrappers ${input}'
}
// Define the callback functions
fn process_functions(response string)! string {
// Process the AI response
// Return error if processing fails
if response.contains('error') {
return error('Failed to process functions: Invalid response format')
}
return response
}
fn process_wrappers(response string)! string {
// Process the AI response
// Return error if processing fails
if !response.contains('fn') {
return error('Failed to process wrappers: No functions found')
}
return response
}
fn process_tests(response string)! string {
// Process the AI response
// Return error if processing fails
if !response.contains('test') {
return error('Failed to process tests: No tests found')
}
return response
}
```
## 5. Key Features and Benefits
1. **V-Idiomatic Design**: Uses V's `@[params]` structures for configuration and the V result type (`fn ()!`) for error handling.
2. **Sequential Task Execution**: Tasks are executed in sequence, with each unit task building on the results of the previous one.
3. **Automatic Model Escalation**: If a unit task fails with a cheaper model, the system automatically retries with a more powerful model.
4. **Flexible Configuration**: Each unit task can be configured with different models, retry counts, and other parameters.
5. **Error Handling**: Detailed error messages and retry mechanisms built on V's native result types.
6. **Callback Processing**: Custom callback functions allow for validation and processing of AI responses.
7. **OpenRouter Integration**: Uses OpenRouter to access a wide range of AI models from different providers.
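To make point 1 concrete, here is a minimal sketch of the configuration pattern (illustrative names only, not part of the module's API):
```v
// Illustrative only: the @[params] + result-type pattern used throughout Escalayer
@[params]
pub struct ExampleParams {
pub:
	name        string
	retry_count ?int // optional; callers may simply omit it
}

pub fn run_example(params ExampleParams) !string {
	retries := params.retry_count or { 3 } // default when the caller omits the field
	if params.name == '' {
		return error('name is required')
	}
	return 'running ${params.name} with ${retries} retries'
}
```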
## 6. Future Enhancements
1. **Parallel Execution**: Add support for executing unit tasks in parallel when they don't depend on each other.
2. **Caching**: Implement caching of AI responses to avoid redundant API calls.
3. **Cost Tracking**: Add functionality to track and report on API usage costs.
4. **Timeout Handling**: Add support for timeouts and graceful handling of long-running tasks.
5. **Streaming Responses**: Support for streaming AI responses for long-form content generation.
6. **Prompt Templates**: Add support for reusable prompt templates.
7. **Logging and Monitoring**: Enhanced logging and monitoring capabilities.

View File

@@ -0,0 +1,529 @@
#!/usr/bin/env -S v -n -w -gc none -cc tcc -d use_openssl -enable-globals run
import freeflowuniverse.herolib.mcp.aitools.escalayer
import os
fn main() {
// Get the current directory
current_dir := os.dir(@FILE)
// Check if a source code path was provided as an argument
if os.args.len < 2 {
println('Please provide the path to the source code directory as an argument')
println('Example: ./example.vsh /path/to/source/code/directory')
return
}
// Get the source code path from the command line arguments
source_code_path := os.args[1]
// Check if the path exists and is a directory
if !os.exists(source_code_path) {
println('Source code path does not exist: ${source_code_path}')
return
}
if !os.is_dir(source_code_path) {
println('Source code path is not a directory: ${source_code_path}')
return
}
// Get all Rust files in the directory
files := os.ls(source_code_path) or {
println('Failed to list files in directory: ${err}')
return
}
// Combine all Rust files into a single source code string
mut source_code := ''
for file in files {
file_path := os.join_path(source_code_path, file)
// Skip directories and non-Rust files
if os.is_dir(file_path) || !file.ends_with('.rs') {
continue
}
// Read the file content
file_content := os.read_file(file_path) or {
println('Failed to read file ${file_path}: ${err}')
continue
}
// Add file content to the combined source code
source_code += '// File: ${file}\n${file_content}\n\n'
}
if source_code == '' {
println('No Rust files found in directory: ${source_code_path}')
return
}
// Read the rhaiwrapping.md file
rhai_wrapping_md := os.read_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhaiwrapping.md') or {
println('Failed to read rhaiwrapping.md: ${err}')
return
}
// Determine the crate path from the source code path
// Extract the path relative to the src directory
src_index := source_code_path.index('src/') or {
println('Could not determine crate path: src/ not found in path')
return
}
mut path_parts := source_code_path[src_index+4..].split('/')
// Remove the last path component (the module directory itself)
if path_parts.len > 0 {
path_parts.delete_last()
}
rel_path := path_parts.join('::')
crate_path := 'sal::${rel_path}'
// Create a new task
mut task := escalayer.new_task(
name: 'rhai_wrapper_creator.escalayer'
description: 'Create Rhai wrappers for Rust functions that follow builder pattern and create examples corresponding to the provided example file'
)
// Create model configs
sonnet_model := escalayer.ModelConfig{
name: 'anthropic/claude-3.7-sonnet'
provider: 'anthropic'
temperature: 0.7
max_tokens: 25000
}
gpt4_model := escalayer.ModelConfig{
name: 'gpt-4'
provider: 'openai'
temperature: 0.7
max_tokens: 25000
}
// Extract the module name from the directory path (last component)
dir_parts := source_code_path.split('/')
name := dir_parts[dir_parts.len - 1]
// Create the prompt with source code, wrapper example, and rhai_wrapping_md
prompt_content := create_rhai_wrappers(name, source_code, os.read_file('${current_dir}/prompts/example_script.md') or { '' }, os.read_file('${current_dir}/prompts/wrapper.md') or { '' }, os.read_file('${current_dir}/prompts/errors.md') or { '' }, crate_path)
// Create a prompt function that returns the prepared content
prompt_function := fn [prompt_content] (input string) string {
return prompt_content
}
gen := RhaiGen{
name: name
dir: source_code_path
}
// Define a single unit task that handles everything
task.new_unit_task(
name: 'create_rhai_wrappers'
prompt_function: prompt_function
callback_function: gen.process_rhai_wrappers
base_model: sonnet_model
retry_model: gpt4_model
retry_count: 1
)
// Initiate the task
result := task.initiate('') or {
println('Task failed: ${err}')
return
}
println('Task completed successfully')
println('The wrapper files have been generated and compiled in the target directory.')
println('Check /Users/timurgordon/code/git.ourworld.tf/herocode/sal/src/rhai for the compiled output.')
}
// Define the prompt functions
fn separate_functions(input string) string {
return 'Read the following Rust code and separate it into functions. Identify all the methods in the Container implementation and their purposes.\n\n${input}'
}
fn create_wrappers(input string) string {
return 'Create Rhai wrappers for the Rust functions identified in the previous step. The wrappers should follow the builder pattern and provide a clean API for use in Rhai scripts. Include error handling and type conversion.\n\n${input}'
}
fn create_example(input string) string {
return 'Create a Rhai example script that demonstrates how to use the wrapper functions. The example should be based on the provided example.rs file but adapted for Rhai syntax. Create a web server example that uses the container functions.\n\n${input}'
}
// Define a Rhai wrapper generator function for Container functions
fn create_rhai_wrappers(name string, source_code string, example_rhai string, wrapper_md string, errors_md string, crate_path string) string {
guides := os.read_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhaiwrapping_classicai.md') or { panic('Failed to read guides') }
engine := $tmpl('./prompts/engine.md')
vector_vs_array := os.read_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhai_array_vs_vector.md') or { panic('Failed to read guides') }
rhai_integration_fixes := os.read_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhai_integration_fixes.md') or { panic('Failed to read guides') }
rhai_syntax_guide := os.read_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhai_syntax_guide.md') or { panic('Failed to read guides') }
generic_wrapper_rs := $tmpl('./templates/generic_wrapper.rs')
return 'You are a Rust developer tasked with creating Rhai wrappers for Rust functions. Please review the following best practices for Rhai wrappers and then create the necessary files.
${guides}
${vector_vs_array}
${example_rhai}
${wrapper_md}
## Common Errors to Avoid
${errors_md}
${rhai_integration_fixes}
${rhai_syntax_guide}
## Your Task
Please create a wrapper.rs file that implements Rhai wrappers for the provided Rust code, and an example.rhai script that demonstrates how to use these wrappers:
## Rust Code to Wrap
```rust
${source_code}
```
IMPORTANT NOTES:
1. For Rhai imports, use: `use rhai::{Engine, EvalAltResult, plugin::*, Dynamic, Map, Array};` - only import what you actually use
2. The following dependencies are available in Cargo.toml:
- rhai = "1.21.0"
- serde = { version = "1.0", features = ["derive"] }
- serde_json = "1.0"
- sal = { path = "../../../" }
3. For the wrapper: `use sal::${name};` this way you can access the module functions and objects with ${name}::
4. The generic_wrapper.rs file will be hardcoded into the package, you can use code from there.
```rust
${generic_wrapper_rs}
```
5. IMPORTANT: Prefer strongly typed return values over Dynamic types whenever possible. Only use Dynamic when absolutely necessary.
- For example, return `Result<String, Box<EvalAltResult>>` instead of `Dynamic` when a function returns a string
- Use `Result<bool, Box<EvalAltResult>>` instead of `Dynamic` when a function returns a boolean
- Use `Result<Vec<String>, Box<EvalAltResult>>` instead of `Dynamic` when a function returns a list of strings
6. Your code should include public functions that can be called from Rhai scripts
7. Make sure to implement all necessary helper functions for type conversion
8. DO NOT use the #[rhai_fn] attribute - functions will be registered directly in the engine
9. Make sure to handle string type consistency - use String::from() for string literals when returning in match arms with format!() strings
10. When returning path references, convert them to owned strings (e.g., path().to_string())
11. For error handling, use proper Result types with Box<EvalAltResult> for the error type:
```rust
// INCORRECT:
pub fn some_function(arg: &str) -> Dynamic {
match some_operation(arg) {
Ok(result) => Dynamic::from(result),
Err(err) => Dynamic::from(format!("Error: {}", err))
}
}
// CORRECT:
pub fn some_function(arg: &str) -> Result<String, Box<EvalAltResult>> {
some_operation(arg).map_err(|err| {
Box::new(EvalAltResult::ErrorRuntime(
format!("Error: {}", err).into(),
rhai::Position::NONE
))
})
}
```
12. IMPORTANT: Format your response with the code between triple backticks as follows:
```rust
// wrapper.rs
// Your wrapper implementation here
```
```rust
// engine.rs
// Your engine.rs implementation here
```
```rhai
// example.rhai
// Your example Rhai script here
```
13. The example.rhai script should demonstrate the use of all the wrapper functions you create
14. The engine.rs file should contain a register_module function that registers all the wrapper functions and types with the Rhai engine, and a create function. For example:
${engine}
MOST IMPORTANT:
import package being wrapped as `use sal::<name>`
your engine create function is called `create_rhai_engine`
```
'
}
@[params]
pub struct WrapperModule {
pub:
lib_rs string
example_rs string
engine_rs string
cargo_toml string
example_rhai string
generic_wrapper_rs string
wrapper_rs string
}
// functions is a list of function names that AI should extract and pass in
fn create_wrapper_module(wrapper WrapperModule, functions []string, name_ string, base_dir string)! string {
// Define project directory paths
name := name_
project_dir := '${base_dir}/rhai'
// Create the project using cargo new --lib
if os.exists(project_dir) {
os.rmdir_all(project_dir) or {
return error('Failed to clean existing project directory: ${err}')
}
}
// Run cargo new --lib to create the project
os.chdir(base_dir) or {
return error('Failed to change directory to base directory: ${err}')
}
cargo_new_result := os.execute('cargo new --lib rhai')
if cargo_new_result.exit_code != 0 {
return error('Failed to create new library project: ${cargo_new_result.output}')
}
// Create examples directory
examples_dir := '${project_dir}/examples'
os.mkdir_all(examples_dir) or {
return error('Failed to create examples directory: ${err}')
}
// Write the lib.rs file
if wrapper.lib_rs != '' {
os.write_file('${project_dir}/src/lib.rs', wrapper.lib_rs) or {
return error('Failed to write lib.rs: ${err}')
}
}
// Write the wrapper.rs file
if wrapper.wrapper_rs != '' {
os.write_file('${project_dir}/src/wrapper.rs', wrapper.wrapper_rs) or {
return error('Failed to write wrapper.rs: ${err}')
}
}
// Write the generic wrapper.rs file
if wrapper.generic_wrapper_rs != '' {
os.write_file('${project_dir}/src/generic_wrapper.rs', wrapper.generic_wrapper_rs) or {
return error('Failed to write generic wrapper.rs: ${err}')
}
}
// Write the example.rs file
if wrapper.example_rs != '' {
os.write_file('${examples_dir}/example.rs', wrapper.example_rs) or {
return error('Failed to write example.rs: ${err}')
}
}
// Write the engine.rs file if provided
if wrapper.engine_rs != '' {
os.write_file('${project_dir}/src/engine.rs', wrapper.engine_rs) or {
return error('Failed to write engine.rs: ${err}')
}
}
// Write the Cargo.toml file
if wrapper.cargo_toml != '' {
os.write_file('${project_dir}/Cargo.toml', wrapper.cargo_toml) or {
return error('Failed to write Cargo.toml: ${err}')
}
}
// Write the example.rhai file if provided
if wrapper.example_rhai != '' {
os.write_file('${examples_dir}/example.rhai', wrapper.example_rhai) or {
return error('Failed to write example.rhai: ${err}')
}
}
return project_dir
}
// Helper function to extract code blocks from the response
fn extract_code_block(response string, identifier string, language string) string {
// Find the start marker for the code block
mut start_marker := '```${language}\n// ${identifier}'
if language == '' {
start_marker = '```\n// ${identifier}'
}
start_index := response.index(start_marker) or {
// Try alternative format
mut alt_marker := '```${language}\n${identifier}'
if language == '' {
alt_marker = '```\n${identifier}'
}
response.index(alt_marker) or {
return ''
}
}
// Find the end marker
end_marker := '```'
end_index := response.index_after(end_marker, start_index + start_marker.len) or {
return ''
}
// Extract the content between the markers
content_start := start_index + start_marker.len
content := response[content_start..end_index].trim_space()
return content
}
// Extract module name from wrapper code
fn extract_module_name(code string) string {
lines := code.split('\n')
for line in lines {
// Look for pub mod or mod declarations
if line.contains('pub mod ') || line.contains('mod ') {
// Extract module name
mut parts := []string{}
if line.contains('pub mod ') {
parts = line.split('pub mod ')
} else {
parts = line.split('mod ')
}
if parts.len > 1 {
// Extract the module name and remove any trailing characters
mut name := parts[1].trim_space()
// Remove any trailing { or ; or whitespace
name = name.trim_right('{').trim_right(';').trim_space()
if name != '' {
return name
}
}
}
}
return ''
}
struct RhaiGen {
name string
dir string
}
// Define the callback function that processes the response and compiles the code
fn (gen RhaiGen) process_rhai_wrappers(response string)! string {
// Extract wrapper.rs content
wrapper_rs_content := extract_code_block(response, 'wrapper.rs', 'rust')
if wrapper_rs_content == '' {
return error('Failed to extract wrapper.rs content from response. Please ensure your code is properly formatted inside a code block that starts with ```rust\n// wrapper.rs and ends with ```')
}
// Extract engine.rs content
mut engine_rs_content := extract_code_block(response, 'engine.rs', 'rust')
if engine_rs_content == '' {
// Try to extract from the response without explicit language marker
engine_rs_content = extract_code_block(response, 'engine.rs', '')
// if engine_rs_content == '' {
// // Use the template engine.rs
// engine_rs_content = $tmpl('./templates/engine.rs')
// }
}
// Extract example.rhai content
mut example_rhai_content := extract_code_block(response, 'example.rhai', 'rhai')
if example_rhai_content == '' {
// Try to extract from the response without explicit language marker
example_rhai_content = extract_code_block(response, 'example.rhai', '')
if example_rhai_content == '' {
// Use the example from the template
example_script_md := os.read_file('${os.dir(@FILE)}/prompts/example_script.md') or {
return error('Failed to read example.rhai template: ${err}')
}
// Extract the code block from the markdown file
example_rhai_content = extract_code_block(example_script_md, 'example.rhai', 'rhai')
if example_rhai_content == '' {
return error('Failed to extract example.rhai from template file')
}
}
}
// Extract function names from the wrapper.rs content
functions := extract_functions_from_code(wrapper_rs_content)
println('Using module name: ${gen.name}_rhai')
println('Extracted functions: ${functions.join(", ")}')
name := gen.name
// Create a WrapperModule struct with the extracted content
wrapper := WrapperModule{
lib_rs: $tmpl('./templates/lib.rs')
wrapper_rs: wrapper_rs_content
example_rs: $tmpl('./templates/example.rs')
engine_rs: engine_rs_content
generic_wrapper_rs: $tmpl('./templates/generic_wrapper.rs')
cargo_toml: $tmpl('./templates/cargo.toml')
example_rhai: example_rhai_content
}
// Create the wrapper module
base_target_dir := gen.dir
project_dir := create_wrapper_module(wrapper, functions, gen.name, base_target_dir) or {
return error('Failed to create wrapper module: ${err}')
}
// Run the example
os.chdir(project_dir) or {
return error('Failed to change directory to project: ${err}')
}
// Run cargo build first
build_result := os.execute('cargo build')
if build_result.exit_code != 0 {
return error('Compilation failed. Please fix the following errors and ensure your code is compatible with the existing codebase:\n\n${build_result.output}')
}
// Run the example
run_result := os.execute('cargo run --example example')
return 'Successfully generated Rhai wrappers and ran the example!\n\nProject created at: ${project_dir}\n\nBuild output:\n${build_result.output}\n\nRun output:\n${run_result.output}'
}
// Extract function names from wrapper code
fn extract_functions_from_code(code string) []string {
mut functions := []string{}
lines := code.split('\n')
for line in lines {
if line.contains('pub fn ') && !line.contains('//') {
// Extract function name
parts := line.split('pub fn ')
if parts.len > 1 {
name_parts := parts[1].split('(')
if name_parts.len > 0 {
fn_name := name_parts[0].trim_space()
if fn_name != '' {
functions << fn_name
}
}
}
}
}
return functions
}

View File

@@ -0,0 +1,617 @@
#!/usr/bin/env -S v -n -w -gc none -cc tcc -d use_openssl -enable-globals run
import freeflowuniverse.herolib.mcp.aitools.escalayer
import os
fn main() {
// Get the current directory where this script is located
current_dir := os.dir(@FILE)
// Validate command line arguments
source_code_path := validate_command_args() or {
println(err)
return
}
// Read and combine all Rust files in the source directory
source_code := read_source_code(source_code_path) or {
println(err)
return
}
// Determine the crate path from the source code path
crate_path := determine_crate_path(source_code_path) or {
println(err)
return
}
// Extract the module name from the directory path (last component)
name := extract_module_name_from_path(source_code_path)
// Create the prompt content for the AI
prompt_content := create_rhai_wrappers(
name,
source_code,
read_file_safely('${current_dir}/prompts/example_script.md'),
read_file_safely('${current_dir}/prompts/wrapper.md'),
read_file_safely('${current_dir}/prompts/errors.md'),
crate_path
)
// Create the generator instance
gen := RhaiGen{
name: name
dir: source_code_path
}
// Run the task to generate Rhai wrappers
run_wrapper_generation_task(prompt_content, gen) or {
println('Task failed: ${err}')
return
}
println('Task completed successfully')
println('The wrapper files have been generated and compiled in the target directory.')
println('Check /Users/timurgordon/code/git.ourworld.tf/herocode/sal/src/rhai for the compiled output.')
}
// Validates command line arguments and returns the source code path
fn validate_command_args() !string {
if os.args.len < 2 {
return error('Please provide the path to the source code directory as an argument\nExample: ./example.vsh /path/to/source/code/directory')
}
source_code_path := os.args[1]
if !os.exists(source_code_path) {
return error('Source code path does not exist: ${source_code_path}')
}
if !os.is_dir(source_code_path) {
return error('Source code path is not a directory: ${source_code_path}')
}
return source_code_path
}
// Reads and combines all Rust files in the given directory
fn read_source_code(source_code_path string) !string {
// Get all files in the directory
files := os.ls(source_code_path) or {
return error('Failed to list files in directory: ${err}')
}
// Combine all Rust files into a single source code string
mut source_code := ''
for file in files {
file_path := os.join_path(source_code_path, file)
// Skip directories and non-Rust files
if os.is_dir(file_path) || !file.ends_with('.rs') {
continue
}
// Read the file content
file_content := os.read_file(file_path) or {
println('Failed to read file ${file_path}: ${err}')
continue
}
// Add file content to the combined source code
source_code += '// File: ${file}\n${file_content}\n\n'
}
if source_code == '' {
return error('No Rust files found in directory: ${source_code_path}')
}
return source_code
}
// Determines the crate path from the source code path
fn determine_crate_path(source_code_path string) !string {
// Extract the path relative to the src directory
src_index := source_code_path.index('src/') or {
return error('Could not determine crate path: src/ not found in path')
}
mut path_parts := source_code_path[src_index+4..].split('/')
// Remove the last path component (the module directory itself)
if path_parts.len > 0 {
path_parts.delete_last()
}
rel_path := path_parts.join('::')
return 'sal::${rel_path}'
}
// Extracts the module name from a directory path
fn extract_module_name_from_path(path string) string {
dir_parts := path.split('/')
return dir_parts[dir_parts.len - 1]
}
// Helper function to read a file or return empty string if file doesn't exist
fn read_file_safely(file_path string) string {
return os.read_file(file_path) or { '' }
}
// Runs the task to generate Rhai wrappers
fn run_wrapper_generation_task(prompt_content string, gen RhaiGen) !string {
// Create a new task
mut task := escalayer.new_task(
name: 'rhai_wrapper_creator.escalayer'
description: 'Create Rhai wrappers for Rust functions that follow builder pattern and create examples corresponding to the provided example file'
)
// Create model configs
sonnet_model := escalayer.ModelConfig{
name: 'anthropic/claude-3.7-sonnet'
provider: 'anthropic'
temperature: 0.7
max_tokens: 25000
}
gpt4_model := escalayer.ModelConfig{
name: 'gpt-4'
provider: 'openai'
temperature: 0.7
max_tokens: 25000
}
// Create a prompt function that returns the prepared content
prompt_function := fn [prompt_content] (input string) string {
return prompt_content
}
// Define a single unit task that handles everything
task.new_unit_task(
name: 'create_rhai_wrappers'
prompt_function: prompt_function
callback_function: gen.process_rhai_wrappers
base_model: sonnet_model
retry_model: gpt4_model
retry_count: 1
)
// Initiate the task
return task.initiate('')
}
// Define a Rhai wrapper generator function for Container functions
fn create_rhai_wrappers(name string, source_code string, example_rhai string, wrapper_md string, errors_md string, crate_path string) string {
// Load all required template and guide files
guides := load_guide_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhaiwrapping_classicai.md')
engine := $tmpl('./prompts/engine.md')
vector_vs_array := load_guide_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhai_array_vs_vector.md')
rhai_integration_fixes := load_guide_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhai_integration_fixes.md')
rhai_syntax_guide := load_guide_file('/Users/timurgordon/code/git.ourworld.tf/herocode/sal/aiprompts/rhai_syntax_guide.md')
generic_wrapper_rs := $tmpl('./templates/generic_wrapper.rs')
// Build the prompt content
return build_prompt_content(name, source_code, example_rhai, wrapper_md, errors_md,
guides, vector_vs_array, rhai_integration_fixes, rhai_syntax_guide,
generic_wrapper_rs, engine)
}
// Helper function to load guide files with error handling
fn load_guide_file(path string) string {
return os.read_file(path) or {
eprintln('Warning: Failed to read guide file: ${path}')
return ''
}
}
// Builds the prompt content for the AI
fn build_prompt_content(name string, source_code string, example_rhai string, wrapper_md string,
errors_md string, guides string, vector_vs_array string,
rhai_integration_fixes string, rhai_syntax_guide string,
generic_wrapper_rs string, engine string) string {
return 'You are a Rust developer tasked with creating Rhai wrappers for Rust functions. Please review the following best practices for Rhai wrappers and then create the necessary files.
${guides}
${vector_vs_array}
${example_rhai}
${wrapper_md}
## Common Errors to Avoid
${errors_md}
${rhai_integration_fixes}
${rhai_syntax_guide}
## Your Task
Please create a wrapper.rs file that implements Rhai wrappers for the provided Rust code, and an example.rhai script that demonstrates how to use these wrappers:
## Rust Code to Wrap
```rust
${source_code}
```
IMPORTANT NOTES:
1. For Rhai imports, use: `use rhai::{Engine, EvalAltResult, plugin::*, Dynamic, Map, Array};` - only import what you actually use
2. The following dependencies are available in Cargo.toml:
- rhai = "1.21.0"
- serde = { version = "1.0", features = ["derive"] }
- serde_json = "1.0"
- sal = { path = "../../../" }
3. For the wrapper: `use sal::${name};` this way you can access the module functions and objects with ${name}::
4. The generic_wrapper.rs file will be hardcoded into the package, you can use code from there.
```rust
${generic_wrapper_rs}
```
5. IMPORTANT: Prefer strongly typed return values over Dynamic types whenever possible. Only use Dynamic when absolutely necessary.
- For example, return `Result<String, Box<EvalAltResult>>` instead of `Dynamic` when a function returns a string
- Use `Result<bool, Box<EvalAltResult>>` instead of `Dynamic` when a function returns a boolean
- Use `Result<Vec<String>, Box<EvalAltResult>>` instead of `Dynamic` when a function returns a list of strings
6. Your code should include public functions that can be called from Rhai scripts
7. Make sure to implement all necessary helper functions for type conversion
8. DO NOT use the #[rhai_fn] attribute - functions will be registered directly in the engine
9. Make sure to handle string type consistency - use String::from() for string literals when returning in match arms with format!() strings
10. When returning path references, convert them to owned strings (e.g., path().to_string())
11. For error handling, use proper Result types with Box<EvalAltResult> for the error type:
```rust
// INCORRECT:
pub fn some_function(arg: &str) -> Dynamic {
match some_operation(arg) {
Ok(result) => Dynamic::from(result),
Err(err) => Dynamic::from(format!("Error: {}", err))
}
}
// CORRECT:
pub fn some_function(arg: &str) -> Result<String, Box<EvalAltResult>> {
some_operation(arg).map_err(|err| {
Box::new(EvalAltResult::ErrorRuntime(
format!("Error: {}", err).into(),
rhai::Position::NONE
))
})
}
```
12. IMPORTANT: Format your response with the code between triple backticks as follows:
```rust
// wrapper.rs
// Your wrapper implementation here
```
```rust
// engine.rs
// Your engine.rs implementation here
```
```rhai
// example.rhai
// Your example Rhai script here
```
13. The example.rhai script should demonstrate the use of all the wrapper functions you create
14. The engine.rs file should contain a register_module function that registers all the wrapper functions and types with the Rhai engine, and a create function. For example:
${engine}
MOST IMPORTANT:
import package being wrapped as `use sal::<name>`
your engine create function is called `create_rhai_engine`
```
'
}
@[params]
pub struct WrapperModule {
pub:
lib_rs string
example_rs string
engine_rs string
cargo_toml string
example_rhai string
generic_wrapper_rs string
wrapper_rs string
}
// functions is a list of function names that AI should extract and pass in
fn create_wrapper_module(wrapper WrapperModule, functions []string, name_ string, base_dir string)! string {
// Define project directory paths
name := name_
project_dir := '${base_dir}/rhai'
// Create the project using cargo new --lib
if os.exists(project_dir) {
os.rmdir_all(project_dir) or {
return error('Failed to clean existing project directory: ${err}')
}
}
// Run cargo new --lib to create the project
os.chdir(base_dir) or {
return error('Failed to change directory to base directory: ${err}')
}
cargo_new_result := os.execute('cargo new --lib rhai')
if cargo_new_result.exit_code != 0 {
return error('Failed to create new library project: ${cargo_new_result.output}')
}
// Create examples directory
examples_dir := '${project_dir}/examples'
os.mkdir_all(examples_dir) or {
return error('Failed to create examples directory: ${err}')
}
// Write the lib.rs file
if wrapper.lib_rs != '' {
os.write_file('${project_dir}/src/lib.rs', wrapper.lib_rs) or {
return error('Failed to write lib.rs: ${err}')
}
}
// Write the wrapper.rs file
if wrapper.wrapper_rs != '' {
os.write_file('${project_dir}/src/wrapper.rs', wrapper.wrapper_rs) or {
return error('Failed to write wrapper.rs: ${err}')
}
}
// Write the generic wrapper.rs file
if wrapper.generic_wrapper_rs != '' {
os.write_file('${project_dir}/src/generic_wrapper.rs', wrapper.generic_wrapper_rs) or {
return error('Failed to write generic wrapper.rs: ${err}')
}
}
// Write the example.rs file
if wrapper.example_rs != '' {
os.write_file('${examples_dir}/example.rs', wrapper.example_rs) or {
return error('Failed to write example.rs: ${err}')
}
}
// Write the engine.rs file if provided
if wrapper.engine_rs != '' {
os.write_file('${project_dir}/src/engine.rs', wrapper.engine_rs) or {
return error('Failed to write engine.rs: ${err}')
}
}
// Write the Cargo.toml file
if wrapper.cargo_toml != '' {
os.write_file('${project_dir}/Cargo.toml', wrapper.cargo_toml) or {
return error('Failed to write Cargo.toml: ${err}')
}
}
// Write the example.rhai file if provided
if wrapper.example_rhai != '' {
os.write_file('${examples_dir}/example.rhai', wrapper.example_rhai) or {
return error('Failed to write example.rhai: ${err}')
}
}
return project_dir
}
// Helper function to extract code blocks from the response
fn extract_code_block(response string, identifier string, language string) string {
// Find the start marker for the code block
mut start_marker := '```${language}\n// ${identifier}'
if language == '' {
start_marker = '```\n// ${identifier}'
}
start_index := response.index(start_marker) or {
// Try alternative format
mut alt_marker := '```${language}\n${identifier}'
if language == '' {
alt_marker = '```\n${identifier}'
}
response.index(alt_marker) or {
return ''
}
}
// Find the end marker
end_marker := '```'
end_index := response.index_after(end_marker, start_index + start_marker.len) or {
return ''
}
// Extract the content between the markers
content_start := start_index + start_marker.len
content := response[content_start..end_index].trim_space()
return content
}
// Extract module name from wrapper code
fn extract_module_name(code string) string {
lines := code.split('\n')
for line in lines {
// Look for pub mod or mod declarations
if line.contains('pub mod ') || line.contains('mod ') {
// Extract module name
mut parts := []string{}
if line.contains('pub mod ') {
parts = line.split('pub mod ')
} else {
parts = line.split('mod ')
}
if parts.len > 1 {
// Extract the module name and remove any trailing characters
mut name := parts[1].trim_space()
// Remove any trailing { or ; or whitespace
name = name.trim_right('{').trim_right(';').trim_space()
if name != '' {
return name
}
}
}
}
return ''
}
// RhaiGen struct for generating Rhai wrappers
struct RhaiGen {
name string
dir string
}
// Process the AI response and compile the generated code
fn (gen RhaiGen) process_rhai_wrappers(response string)! string {
// Extract code blocks from the response
code_blocks := extract_code_blocks(response) or {
return err
}
// Extract function names from the wrapper.rs content
functions := extract_functions_from_code(code_blocks.wrapper_rs)
println('Using module name: ${gen.name}_rhai')
println('Extracted functions: ${functions.join(", ")}')
name := gen.name
// Create a WrapperModule struct with the extracted content
wrapper := WrapperModule{
lib_rs: $tmpl('./templates/lib.rs')
wrapper_rs: code_blocks.wrapper_rs
example_rs: $tmpl('./templates/example.rs')
engine_rs: code_blocks.engine_rs
generic_wrapper_rs: $tmpl('./templates/generic_wrapper.rs')
cargo_toml: $tmpl('./templates/cargo.toml')
example_rhai: code_blocks.example_rhai
}
// Create the wrapper module
project_dir := create_wrapper_module(wrapper, functions, gen.name, gen.dir) or {
return error('Failed to create wrapper module: ${err}')
}
// Build and run the project
build_output, run_output := build_and_run_project(project_dir) or {
return err
}
return format_success_message(project_dir, build_output, run_output)
}
// CodeBlocks struct to hold extracted code blocks
struct CodeBlocks {
wrapper_rs string
engine_rs string
example_rhai string
}
// Extract code blocks from the AI response
fn extract_code_blocks(response string)! CodeBlocks {
// Extract wrapper.rs content
wrapper_rs_content := extract_code_block(response, 'wrapper.rs', 'rust')
if wrapper_rs_content == '' {
return error('Failed to extract wrapper.rs content from response. Please ensure your code is properly formatted inside a code block that starts with ```rust\n// wrapper.rs and ends with ```')
}
// Extract engine.rs content
mut engine_rs_content := extract_code_block(response, 'engine.rs', 'rust')
if engine_rs_content == '' {
// Try to extract from the response without explicit language marker
engine_rs_content = extract_code_block(response, 'engine.rs', '')
}
// Extract example.rhai content
mut example_rhai_content := extract_code_block(response, 'example.rhai', 'rhai')
if example_rhai_content == '' {
// Try to extract from the response without explicit language marker
example_rhai_content = extract_code_block(response, 'example.rhai', '')
if example_rhai_content == '' {
// Use the example from the template
example_rhai_content = load_example_from_template() or {
return err
}
}
}
return CodeBlocks{
wrapper_rs: wrapper_rs_content
engine_rs: engine_rs_content
example_rhai: example_rhai_content
}
}
// Load example.rhai from template file
fn load_example_from_template()! string {
example_script_md := os.read_file('${os.dir(@FILE)}/prompts/example_script.md') or {
return error('Failed to read example.rhai template: ${err}')
}
// Extract the code block from the markdown file
example_rhai_content := extract_code_block(example_script_md, 'example.rhai', 'rhai')
if example_rhai_content == '' {
return error('Failed to extract example.rhai from template file')
}
return example_rhai_content
}
// Build and run the project
fn build_and_run_project(project_dir string)! (string, string) {
// Change to the project directory
os.chdir(project_dir) or {
return error('Failed to change directory to project: ${err}')
}
// Run cargo build first
build_result := os.execute('cargo build')
if build_result.exit_code != 0 {
return error('Compilation failed. Please fix the following errors and ensure your code is compatible with the existing codebase:\n\n${build_result.output}')
}
// Run the example
run_result := os.execute('cargo run --example example')
return build_result.output, run_result.output
}
// Format success message
fn format_success_message(project_dir string, build_output string, run_output string) string {
return 'Successfully generated Rhai wrappers and ran the example!\n\nProject created at: ${project_dir}\n\nBuild output:\n${build_output}\n\nRun output:\n${run_output}'
}
// Extract function names from wrapper code
fn extract_functions_from_code(code string) []string {
mut functions := []string{}
lines := code.split('\n')
for line in lines {
if line.contains('pub fn ') && !line.contains('//') {
// Extract function name
parts := line.split('pub fn ')
if parts.len > 1 {
name_parts := parts[1].split('(')
if name_parts.len > 0 {
fn_name := name_parts[0].trim_space()
if fn_name != '' {
functions << fn_name
}
}
}
}
}
return functions
}

View File

@@ -0,0 +1,510 @@
// File: /root/code/git.ourworld.tf/herocode/sal/src/virt/nerdctl/container_builder.rs
use std::collections::HashMap;
use crate::virt::nerdctl::{execute_nerdctl_command, NerdctlError};
use super::container_types::{Container, HealthCheck};
use super::health_check_script::prepare_health_check_command;
impl Container {
/// Reset the container configuration to defaults while keeping the name and image
/// If the container exists, it will be stopped and removed.
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn reset(self) -> Self {
let name = self.name;
let image = self.image.clone();
// If container exists, stop and remove it
if let Some(container_id) = &self.container_id {
println!("Container exists. Stopping and removing container '{}'...", name);
// Try to stop the container
let _ = execute_nerdctl_command(&["stop", container_id]);
// Try to remove the container
let _ = execute_nerdctl_command(&["rm", container_id]);
}
// Create a new container with just the name and image, but no container_id
Self {
name,
container_id: None, // Reset container_id to None since we removed the container
image,
config: std::collections::HashMap::new(),
ports: Vec::new(),
volumes: Vec::new(),
env_vars: std::collections::HashMap::new(),
network: None,
network_aliases: Vec::new(),
cpu_limit: None,
memory_limit: None,
memory_swap_limit: None,
cpu_shares: None,
restart_policy: None,
health_check: None,
detach: false,
snapshotter: None,
}
}
/// Add a port mapping
///
/// # Arguments
///
/// * `port` - Port mapping (e.g., "8080:80")
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_port(mut self, port: &str) -> Self {
self.ports.push(port.to_string());
self
}
/// Add multiple port mappings
///
/// # Arguments
///
/// * `ports` - Array of port mappings (e.g., ["8080:80", "8443:443"])
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_ports(mut self, ports: &[&str]) -> Self {
for port in ports {
self.ports.push(port.to_string());
}
self
}
/// Add a volume mount
///
/// # Arguments
///
/// * `volume` - Volume mount (e.g., "/host/path:/container/path")
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_volume(mut self, volume: &str) -> Self {
self.volumes.push(volume.to_string());
self
}
/// Add multiple volume mounts
///
/// # Arguments
///
/// * `volumes` - Array of volume mounts (e.g., ["/host/path1:/container/path1", "/host/path2:/container/path2"])
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_volumes(mut self, volumes: &[&str]) -> Self {
for volume in volumes {
self.volumes.push(volume.to_string());
}
self
}
/// Add an environment variable
///
/// # Arguments
///
/// * `key` - Environment variable name
/// * `value` - Environment variable value
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_env(mut self, key: &str, value: &str) -> Self {
self.env_vars.insert(key.to_string(), value.to_string());
self
}
/// Add multiple environment variables
///
/// # Arguments
///
/// * `env_map` - Map of environment variable names to values
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_envs(mut self, env_map: &HashMap<&str, &str>) -> Self {
for (key, value) in env_map {
self.env_vars.insert(key.to_string(), value.to_string());
}
self
}
/// Set the network for the container
///
/// # Arguments
///
/// * `network` - Network name
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_network(mut self, network: &str) -> Self {
self.network = Some(network.to_string());
self
}
/// Add a network alias for the container
///
/// # Arguments
///
/// * `alias` - Network alias
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_network_alias(mut self, alias: &str) -> Self {
self.network_aliases.push(alias.to_string());
self
}
/// Add multiple network aliases for the container
///
/// # Arguments
///
/// * `aliases` - Array of network aliases
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_network_aliases(mut self, aliases: &[&str]) -> Self {
for alias in aliases {
self.network_aliases.push(alias.to_string());
}
self
}
/// Set CPU limit for the container
///
/// # Arguments
///
/// * `cpus` - CPU limit (e.g., "0.5" for half a CPU, "2" for 2 CPUs)
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_cpu_limit(mut self, cpus: &str) -> Self {
self.cpu_limit = Some(cpus.to_string());
self
}
/// Set memory limit for the container
///
/// # Arguments
///
/// * `memory` - Memory limit (e.g., "512m" for 512MB, "1g" for 1GB)
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_memory_limit(mut self, memory: &str) -> Self {
self.memory_limit = Some(memory.to_string());
self
}
/// Set memory swap limit for the container
///
/// # Arguments
///
/// * `memory_swap` - Memory swap limit (e.g., "1g" for 1GB)
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_memory_swap_limit(mut self, memory_swap: &str) -> Self {
self.memory_swap_limit = Some(memory_swap.to_string());
self
}
/// Set CPU shares for the container (relative weight)
///
/// # Arguments
///
/// * `shares` - CPU shares (e.g., "1024" for default, "512" for half)
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_cpu_shares(mut self, shares: &str) -> Self {
self.cpu_shares = Some(shares.to_string());
self
}
/// Set restart policy for the container
///
/// # Arguments
///
/// * `policy` - Restart policy (e.g., "no", "always", "on-failure", "unless-stopped")
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_restart_policy(mut self, policy: &str) -> Self {
self.restart_policy = Some(policy.to_string());
self
}
/// Set a simple health check for the container
///
/// # Arguments
///
/// * `cmd` - Command to run for health check (e.g., "curl -f http://localhost/ || exit 1")
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_health_check(mut self, cmd: &str) -> Self {
// Use the health check script module to prepare the command
let prepared_cmd = prepare_health_check_command(cmd, &self.name);
self.health_check = Some(HealthCheck {
cmd: prepared_cmd,
interval: None,
timeout: None,
retries: None,
start_period: None,
});
self
}
/// Set a health check with custom options for the container
///
/// # Arguments
///
/// * `cmd` - Command to run for health check
/// * `interval` - Optional time between running the check (e.g., "30s", "1m")
/// * `timeout` - Optional maximum time to wait for a check to complete (e.g., "30s", "1m")
/// * `retries` - Optional number of consecutive failures needed to consider unhealthy
/// * `start_period` - Optional start period for the container to initialize before counting retries (e.g., "30s", "1m")
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_health_check_options(
mut self,
cmd: &str,
interval: Option<&str>,
timeout: Option<&str>,
retries: Option<u32>,
start_period: Option<&str>,
) -> Self {
// Use the health check script module to prepare the command
let prepared_cmd = prepare_health_check_command(cmd, &self.name);
let mut health_check = HealthCheck {
cmd: prepared_cmd,
interval: None,
timeout: None,
retries: None,
start_period: None,
};
if let Some(interval_value) = interval {
health_check.interval = Some(interval_value.to_string());
}
if let Some(timeout_value) = timeout {
health_check.timeout = Some(timeout_value.to_string());
}
if let Some(retries_value) = retries {
health_check.retries = Some(retries_value);
}
if let Some(start_period_value) = start_period {
health_check.start_period = Some(start_period_value.to_string());
}
self.health_check = Some(health_check);
self
}
/// Set the snapshotter
///
/// # Arguments
///
/// * `snapshotter` - Snapshotter to use
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_snapshotter(mut self, snapshotter: &str) -> Self {
self.snapshotter = Some(snapshotter.to_string());
self
}
/// Set whether to run in detached mode
///
/// # Arguments
///
/// * `detach` - Whether to run in detached mode
///
/// # Returns
///
/// * `Self` - The container instance for method chaining
pub fn with_detach(mut self, detach: bool) -> Self {
self.detach = detach;
self
}
/// Build the container
///
/// # Returns
///
/// * `Result<Self, NerdctlError>` - Container instance or error
pub fn build(self) -> Result<Self, NerdctlError> {
// If container already exists, return it
if self.container_id.is_some() {
return Ok(self);
}
// If no image is specified, return an error
let image = match &self.image {
Some(img) => img,
None => return Err(NerdctlError::Other("No image specified for container creation".to_string())),
};
// Build the command arguments as strings
let mut args_strings = Vec::new();
args_strings.push("run".to_string());
if self.detach {
args_strings.push("-d".to_string());
}
args_strings.push("--name".to_string());
args_strings.push(self.name.clone());
// Add port mappings
for port in &self.ports {
args_strings.push("-p".to_string());
args_strings.push(port.clone());
}
// Add volume mounts
for volume in &self.volumes {
args_strings.push("-v".to_string());
args_strings.push(volume.clone());
}
// Add environment variables
for (key, value) in &self.env_vars {
args_strings.push("-e".to_string());
args_strings.push(format!("{}={}", key, value));
}
// Add network configuration
if let Some(network) = &self.network {
args_strings.push("--network".to_string());
args_strings.push(network.clone());
}
// Add network aliases
for alias in &self.network_aliases {
args_strings.push("--network-alias".to_string());
args_strings.push(alias.clone());
}
// Add resource limits
if let Some(cpu_limit) = &self.cpu_limit {
args_strings.push("--cpus".to_string());
args_strings.push(cpu_limit.clone());
}
if let Some(memory_limit) = &self.memory_limit {
args_strings.push("--memory".to_string());
args_strings.push(memory_limit.clone());
}
if let Some(memory_swap_limit) = &self.memory_swap_limit {
args_strings.push("--memory-swap".to_string());
args_strings.push(memory_swap_limit.clone());
}
if let Some(cpu_shares) = &self.cpu_shares {
args_strings.push("--cpu-shares".to_string());
args_strings.push(cpu_shares.clone());
}
// Add restart policy
if let Some(restart_policy) = &self.restart_policy {
args_strings.push("--restart".to_string());
args_strings.push(restart_policy.clone());
}
// Add health check
if let Some(health_check) = &self.health_check {
args_strings.push("--health-cmd".to_string());
args_strings.push(health_check.cmd.clone());
if let Some(interval) = &health_check.interval {
args_strings.push("--health-interval".to_string());
args_strings.push(interval.clone());
}
if let Some(timeout) = &health_check.timeout {
args_strings.push("--health-timeout".to_string());
args_strings.push(timeout.clone());
}
if let Some(retries) = &health_check.retries {
args_strings.push("--health-retries".to_string());
args_strings.push(retries.to_string());
}
if let Some(start_period) = &health_check.start_period {
args_strings.push("--health-start-period".to_string());
args_strings.push(start_period.clone());
}
}
if let Some(snapshotter_value) = &self.snapshotter {
args_strings.push("--snapshotter".to_string());
args_strings.push(snapshotter_value.clone());
}
// Add flags to avoid BPF issues
args_strings.push("--cgroup-manager=cgroupfs".to_string());
args_strings.push(image.clone());
// Convert to string slices for the command
let args: Vec<&str> = args_strings.iter().map(|s| s.as_str()).collect();
// Execute the command
let result = execute_nerdctl_command(&args)?;
// Get the container ID from the output
let container_id = result.stdout.trim().to_string();
Ok(Self {
name: self.name,
container_id: Some(container_id),
image: self.image,
config: self.config,
ports: self.ports,
volumes: self.volumes,
env_vars: self.env_vars,
network: self.network,
network_aliases: self.network_aliases,
cpu_limit: self.cpu_limit,
memory_limit: self.memory_limit,
memory_swap_limit: self.memory_swap_limit,
cpu_shares: self.cpu_shares,
restart_policy: self.restart_policy,
health_check: self.health_check,
detach: self.detach,
snapshotter: self.snapshotter,
})
}
}
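// Usage sketch (not part of the module): chaining the builder methods above inside a
// function that returns Result<_, NerdctlError>. Container::from_image is assumed to
// exist as used by the Rhai wrapper elsewhere in this commit; the image and values are
// illustrative only.
//
//     let container = Container::from_image("web", "nginx:latest")?
//         .with_port("8080:80")
//         .with_env("APP_ENV", "production")
//         .with_health_check("curl -f http://localhost/ || exit 1")
//         .with_detach(true)
//         .build()?;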

View File

@@ -0,0 +1,125 @@
# Engine
Here is an example of a well-implemented Rhai engine for the Nerdctl module:
## Example engine
```rust
// engine.rs
use rhai::{Engine, EvalAltResult, Array, Dynamic};
// The wrapper functions (container_new, nerdctl_run, ...) and the Container and Image
// types are assumed to live at the paths used in the wrapper example; adjust to your crate layout.
use crate::wrapper::*;
use crate::virt::nerdctl::{Container, Image};
/// Create a Rhai engine with the Nerdctl module registered
pub fn create_rhai_engine() -> Engine {
let mut engine = Engine::new();
register_nerdctl_module(&mut engine).expect("failed to register nerdctl module");
register_nerdctl_types(&mut engine).expect("failed to register nerdctl types");
engine
}
/// Register Nerdctl module functions with the Rhai engine
pub fn register_nerdctl_module(engine: &mut Engine) -> Result<(), Box<EvalAltResult>> {
// Register Container constructor
engine.register_fn("nerdctl_container_new", container_new);
engine.register_fn("nerdctl_container_from_image", container_from_image);
// Register Container instance methods
engine.register_fn("reset", container_reset);
engine.register_fn("with_port", container_with_port);
engine.register_fn("with_volume", container_with_volume);
engine.register_fn("with_env", container_with_env);
engine.register_fn("with_network", container_with_network);
engine.register_fn("with_network_alias", container_with_network_alias);
engine.register_fn("with_cpu_limit", container_with_cpu_limit);
engine.register_fn("with_memory_limit", container_with_memory_limit);
engine.register_fn("with_restart_policy", container_with_restart_policy);
engine.register_fn("with_health_check", container_with_health_check);
engine.register_fn("with_ports", container_with_ports);
engine.register_fn("with_volumes", container_with_volumes);
engine.register_fn("with_envs", container_with_envs);
engine.register_fn("with_network_aliases", container_with_network_aliases);
engine.register_fn("with_memory_swap_limit", container_with_memory_swap_limit);
engine.register_fn("with_cpu_shares", container_with_cpu_shares);
engine.register_fn("with_health_check_options", container_with_health_check_options);
engine.register_fn("with_snapshotter", container_with_snapshotter);
engine.register_fn("with_detach", container_with_detach);
engine.register_fn("build", container_build);
engine.register_fn("start", container_start);
engine.register_fn("stop", container_stop);
engine.register_fn("remove", container_remove);
engine.register_fn("exec", container_exec);
engine.register_fn("logs", container_logs);
engine.register_fn("copy", container_copy);
// Register legacy container functions (for backward compatibility)
engine.register_fn("nerdctl_run", nerdctl_run);
engine.register_fn("nerdctl_run_with_name", nerdctl_run_with_name);
engine.register_fn("nerdctl_run_with_port", nerdctl_run_with_port);
engine.register_fn("new_run_options", new_run_options);
engine.register_fn("nerdctl_exec", nerdctl_exec);
engine.register_fn("nerdctl_copy", nerdctl_copy);
engine.register_fn("nerdctl_stop", nerdctl_stop);
engine.register_fn("nerdctl_remove", nerdctl_remove);
engine.register_fn("nerdctl_list", nerdctl_list);
engine.register_fn("nerdctl_logs", nerdctl_logs);
// Register image functions
engine.register_fn("nerdctl_images", nerdctl_images);
engine.register_fn("nerdctl_image_remove", nerdctl_image_remove);
engine.register_fn("nerdctl_image_push", nerdctl_image_push);
engine.register_fn("nerdctl_image_tag", nerdctl_image_tag);
engine.register_fn("nerdctl_image_pull", nerdctl_image_pull);
engine.register_fn("nerdctl_image_commit", nerdctl_image_commit);
engine.register_fn("nerdctl_image_build", nerdctl_image_build);
Ok(())
}
/// Register Nerdctl module types with the Rhai engine
fn register_nerdctl_types(engine: &mut Engine) -> Result<(), Box<EvalAltResult>> {
// Register Container type
engine.register_type_with_name::<Container>("NerdctlContainer");
// Register getters for Container properties
engine.register_get("name", |container: &mut Container| container.name.clone());
engine.register_get("container_id", |container: &mut Container| {
match &container.container_id {
Some(id) => id.clone(),
None => "".to_string(),
}
});
engine.register_get("image", |container: &mut Container| {
match &container.image {
Some(img) => img.clone(),
None => "".to_string(),
}
});
engine.register_get("ports", |container: &mut Container| {
let mut array = Array::new();
for port in &container.ports {
array.push(Dynamic::from(port.clone()));
}
array
});
engine.register_get("volumes", |container: &mut Container| {
let mut array = Array::new();
for volume in &container.volumes {
array.push(Dynamic::from(volume.clone()));
}
array
});
engine.register_get("detach", |container: &mut Container| container.detach);
// Register Image type and methods
engine.register_type_with_name::<Image>("NerdctlImage");
// Register getters for Image properties
engine.register_get("id", |img: &mut Image| img.id.clone());
engine.register_get("repository", |img: &mut Image| img.repository.clone());
engine.register_get("tag", |img: &mut Image| img.tag.clone());
engine.register_get("size", |img: &mut Image| img.size.clone());
engine.register_get("created", |img: &mut Image| img.created.clone());
Ok(())
}
```
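As a rough usage sketch (assuming only the `create_rhai_engine` function shown above), the returned engine can evaluate a Rhai script and hand back its final integer value:

```rust
use rhai::EvalAltResult;

/// Evaluate a Rhai script with the Nerdctl engine and return its integer result.
fn run_nerdctl_script(script: &str) -> Result<i64, Box<EvalAltResult>> {
    let engine = create_rhai_engine();
    // Scripts in this repo conventionally end with an integer return value.
    engine.eval::<i64>(script)
}
```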

View File

@@ -0,0 +1,186 @@
# Common Errors in Rhai Wrappers and How to Fix Them
When creating Rhai wrappers for Rust functions, you might encounter several common errors. Here's how to address them:
## 1. `rhai_fn` Attribute Errors
```
error: cannot find attribute `rhai_fn` in this scope
```
**Solution**: Do not use the `#[rhai_fn]` attribute. Instead, register functions directly in the engine:
```rust
// INCORRECT:
#[rhai_fn(name = "pull_repository")]
pub fn pull_repository(repo: &mut GitRepo) -> Dynamic { ... }
// CORRECT:
pub fn pull_repository(repo: &mut GitRepo) -> Dynamic { ... }
// Then register in engine.rs:
engine.register_fn("pull_repository", pull_repository);
```
## 2. Function Visibility Errors
```
error[E0603]: function `create_rhai_engine` is private
```
**Solution**: Make sure to declare functions as `pub` when they need to be accessed from other modules:
```rust
// INCORRECT:
fn create_rhai_engine() -> Engine { ... }
// CORRECT:
pub fn create_rhai_engine() -> Engine { ... }
```
## 3. Type Errors with String vs &str
```
error[E0308]: `match` arms have incompatible types
```
**Solution**: Ensure consistent return types in match arms. When one arm returns a string literal (`&str`) and another returns a `String`, convert them to be consistent:
```rust
// INCORRECT:
match r.pull() {
Ok(_) => "Successfully pulled changes",
Err(err) => {
let error_msg = format!("Error pulling changes: {}", err);
error_msg // This is a String, not matching the &str above
}
}
// CORRECT - Option 1: Convert &str to String
match r.pull() {
Ok(_) => String::from("Successfully pulled changes"),
Err(err) => format!("Error pulling changes: {}", err)
}
// CORRECT - Option 2: Use String::from for all string literals
match r.pull() {
Ok(_) => String::from("Successfully pulled changes"),
Err(err) => {
let error_msg = format!("Error pulling changes: {}", err);
error_msg
}
}
```
## 4. Lifetime Errors
```
error: lifetime may not live long enough
```
**Solution**: When returning references from closures, you need to ensure the lifetime is valid. For path operations, convert to owned strings:
```rust
// INCORRECT:
repo_clone.wrap(|r| r.path())
// CORRECT:
repo_clone.wrap(|r| r.path().to_string())
```
## 5. Sized Trait Errors
```
error[E0277]: the size for values of type `Self` cannot be known at compilation time
```
**Solution**: Add a `Sized` bound to the `Self` type in trait definitions:
```rust
// INCORRECT:
trait RhaiWrapper {
fn wrap<F, R>(&self, f: F) -> Dynamic
where
F: FnOnce(Self) -> R,
R: ToRhai;
}
// CORRECT:
trait RhaiWrapper {
fn wrap<F, R>(&self, f: F) -> Dynamic
where
F: FnOnce(Self) -> R,
R: ToRhai,
Self: Sized;
}
```
## 6. Unused Imports
```
warning: unused imports: `Engine`, `EvalAltResult`, `FLOAT`, `INT`, and `plugin::*`
```
**Solution**: Remove unused imports to clean up your code:
```rust
// INCORRECT:
use rhai::{Engine, EvalAltResult, plugin::*, FLOAT, INT, Dynamic, Map, Array};
// CORRECT - only keep what you use:
use rhai::{Dynamic, Array};
```
## 7. Overuse of Dynamic Types
```
error[E0277]: the trait bound `Vec<Dynamic>: generic_wrapper::ToRhai` is not satisfied
```
**Solution**: Use proper static typing instead of Dynamic types whenever possible. This improves type safety and makes the code more maintainable:
```rust
// INCORRECT: Returning Dynamic for everything
pub fn list_repositories(tree: &mut GitTree) -> Dynamic {
let tree_clone = tree.clone();
tree_clone.wrap(|t| {
match t.list() {
Ok(repos) => repos,
Err(err) => vec![format!("Error listing repositories: {}", err)]
}
})
}
// CORRECT: Using proper Result types
pub fn list_repositories(tree: &mut GitTree) -> Result<Vec<String>, Box<EvalAltResult>> {
let tree_clone = tree.clone();
tree_clone.list().map_err(|err| {
Box::new(EvalAltResult::ErrorRuntime(
format!("Error listing repositories: {}", err).into(),
rhai::Position::NONE
))
})
}
```
## 8. Improper Error Handling
```
error[E0277]: the trait bound `for<'a> fn(&'a mut Engine) -> Result<(), Box<EvalAltResult>> {wrapper::register_git_module}: RhaiNativeFunc<_, _, _, _, _>` is not satisfied
```
**Solution**: When registering functions that return Result types, make sure they are properly handled:
```rust
// INCORRECT: Trying to register a function that returns Result<(), Box<EvalAltResult>>
engine.register_fn("register_git_module", wrapper::register_git_module);
// CORRECT: Wrap the function to handle the Result
engine.register_fn("register_git_module", |engine: &mut Engine| {
match wrapper::register_git_module(engine) {
Ok(_) => Dynamic::from(true),
Err(err) => Dynamic::from(format!("Error: {}", err))
}
});
```
Remember to adapt these solutions to your specific code context. The key is to maintain type consistency, proper visibility, correct lifetime management, and appropriate static typing.

View File

@@ -0,0 +1,40 @@
## Example Rhai Script
Now, given the source code you wrapped with Rhai-callable functions, write an example Rhai script that uses those functions.
### Example Rhai script
```rhai
// example.rhai
// Create a new GitTree instance
let git_tree = new_git_tree("/Users/timurgordon/code");
print("\nCreated GitTree for: /Users/timurgordon/code");
// List repositories in the tree
let repos = list_repositories(git_tree);
print("Found " + repos.len() + " repositories");
if repos.len() > 0 {
print("First repository: " + repos[0]);
// Get the repository
let repo_array = get_repositories(git_tree, repos[0]);
if repo_array.len() > 0 {
let repo = repo_array[0];
print("\nRepository path: " + get_repo_path(repo));
// Check if the repository has changes
let has_changes = has_changes(repo);
print("Has changes: " + has_changes);
// Try to pull the repository
print("\nTrying to pull repository...");
let pull_result = pull_repository(repo);
print("Pull result: " + pull_result);
}
}
print("\nResult: Git operations completed successfully");
42 // Return value
```

View File

@@ -0,0 +1,473 @@
# Wrapper
Here is an example of a well-implemented Rhai wrapper for the Nerdctl module:
## Example wrapper
```rust
// wrapper.rs
//! Rhai wrappers for Nerdctl module functions
//!
//! This module provides Rhai wrappers for the functions in the Nerdctl module.
use rhai::{Engine, EvalAltResult, Array, Dynamic, Map};
use crate::virt::nerdctl::{self, NerdctlError, Image, Container};
use crate::process::CommandResult;
// Helper functions for error conversion with improved context
fn nerdctl_error_to_rhai_error<T>(result: Result<T, NerdctlError>) -> Result<T, Box<EvalAltResult>> {
result.map_err(|e| {
// Create a more detailed error message based on the error type
let error_message = match &e {
NerdctlError::CommandExecutionFailed(io_err) => {
format!("Failed to execute nerdctl command: {}. This may indicate nerdctl is not installed or not in PATH.", io_err)
},
NerdctlError::CommandFailed(msg) => {
format!("Nerdctl command failed: {}. Check container status and logs for more details.", msg)
},
NerdctlError::JsonParseError(msg) => {
format!("Failed to parse nerdctl JSON output: {}. This may indicate an incompatible nerdctl version.", msg)
},
NerdctlError::ConversionError(msg) => {
format!("Data conversion error: {}. This may indicate unexpected output format from nerdctl.", msg)
},
NerdctlError::Other(msg) => {
format!("Nerdctl error: {}. This is an unexpected error.", msg)
},
};
Box::new(EvalAltResult::ErrorRuntime(
error_message.into(),
rhai::Position::NONE
))
})
}
//
// Container Builder Pattern Implementation
//
/// Create a new Container
pub fn container_new(name: &str) -> Result<Container, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(Container::new(name))
}
/// Create a Container from an image
pub fn container_from_image(name: &str, image: &str) -> Result<Container, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(Container::from_image(name, image))
}
/// Reset the container configuration to defaults while keeping the name and image
pub fn container_reset(container: Container) -> Container {
container.reset()
}
/// Add a port mapping to a Container
pub fn container_with_port(container: Container, port: &str) -> Container {
container.with_port(port)
}
/// Add a volume mount to a Container
pub fn container_with_volume(container: Container, volume: &str) -> Container {
container.with_volume(volume)
}
/// Add an environment variable to a Container
pub fn container_with_env(container: Container, key: &str, value: &str) -> Container {
container.with_env(key, value)
}
/// Set the network for a Container
pub fn container_with_network(container: Container, network: &str) -> Container {
container.with_network(network)
}
/// Add a network alias to a Container
pub fn container_with_network_alias(container: Container, alias: &str) -> Container {
container.with_network_alias(alias)
}
/// Set CPU limit for a Container
pub fn container_with_cpu_limit(container: Container, cpus: &str) -> Container {
container.with_cpu_limit(cpus)
}
/// Set memory limit for a Container
pub fn container_with_memory_limit(container: Container, memory: &str) -> Container {
container.with_memory_limit(memory)
}
/// Set restart policy for a Container
pub fn container_with_restart_policy(container: Container, policy: &str) -> Container {
container.with_restart_policy(policy)
}
/// Set health check for a Container
pub fn container_with_health_check(container: Container, cmd: &str) -> Container {
container.with_health_check(cmd)
}
/// Add multiple port mappings to a Container
pub fn container_with_ports(mut container: Container, ports: Array) -> Container {
for port in ports.iter() {
if port.is_string() {
let port_str = port.clone().cast::<String>();
container = container.with_port(&port_str);
}
}
container
}
/// Add multiple volume mounts to a Container
pub fn container_with_volumes(mut container: Container, volumes: Array) -> Container {
for volume in volumes.iter() {
if volume.is_string() {
let volume_str = volume.clone().cast::<String>();
container = container.with_volume(&volume_str);
}
}
container
}
/// Add multiple environment variables to a Container
pub fn container_with_envs(mut container: Container, env_map: Map) -> Container {
for (key, value) in env_map.iter() {
if value.is_string() {
let value_str = value.clone().cast::<String>();
container = container.with_env(&key, &value_str);
}
}
container
}
/// Add multiple network aliases to a Container
pub fn container_with_network_aliases(mut container: Container, aliases: Array) -> Container {
for alias in aliases.iter() {
if alias.is_string() {
let alias_str = alias.clone().cast::<String>();
container = container.with_network_alias(&alias_str);
}
}
container
}
/// Set memory swap limit for a Container
pub fn container_with_memory_swap_limit(container: Container, memory_swap: &str) -> Container {
container.with_memory_swap_limit(memory_swap)
}
/// Set CPU shares for a Container
pub fn container_with_cpu_shares(container: Container, shares: &str) -> Container {
container.with_cpu_shares(shares)
}
/// Set health check with options for a Container
pub fn container_with_health_check_options(
container: Container,
cmd: &str,
interval: Option<&str>,
timeout: Option<&str>,
retries: Option<i64>,
start_period: Option<&str>
) -> Container {
// Convert i64 to u32 for retries
let retries_u32 = retries.map(|r| r as u32);
container.with_health_check_options(cmd, interval, timeout, retries_u32, start_period)
}
/// Set snapshotter for a Container
pub fn container_with_snapshotter(container: Container, snapshotter: &str) -> Container {
container.with_snapshotter(snapshotter)
}
/// Set detach mode for a Container
pub fn container_with_detach(container: Container, detach: bool) -> Container {
container.with_detach(detach)
}
/// Build and run the Container
///
/// This function builds and runs the container using the configured options.
/// It provides detailed error information if the build fails.
pub fn container_build(container: Container) -> Result<Container, Box<EvalAltResult>> {
// Get container details for better error reporting
let container_name = container.name.clone();
let image = container.image.clone().unwrap_or_else(|| "none".to_string());
let ports = container.ports.clone();
let volumes = container.volumes.clone();
let env_vars = container.env_vars.clone();
// Try to build the container
let build_result = container.build();
// Handle the result with improved error context
match build_result {
Ok(built_container) => {
// Container built successfully
Ok(built_container)
},
Err(err) => {
// Add more context to the error
let enhanced_error = match err {
NerdctlError::CommandFailed(msg) => {
// Provide more detailed error information
let mut enhanced_msg = format!("Failed to build container '{}' from image '{}': {}",
container_name, image, msg);
// Add information about configured options that might be relevant
if !ports.is_empty() {
enhanced_msg.push_str(&format!("\nConfigured ports: {:?}", ports));
}
if !volumes.is_empty() {
enhanced_msg.push_str(&format!("\nConfigured volumes: {:?}", volumes));
}
if !env_vars.is_empty() {
enhanced_msg.push_str(&format!("\nConfigured environment variables: {:?}", env_vars));
}
// Add suggestions for common issues
if msg.contains("not found") || msg.contains("no such image") {
enhanced_msg.push_str("\nSuggestion: The specified image may not exist or may not be pulled yet. Try pulling the image first with nerdctl_image_pull().");
} else if msg.contains("port is already allocated") {
enhanced_msg.push_str("\nSuggestion: One of the specified ports is already in use. Try using a different port or stopping the container using that port.");
} else if msg.contains("permission denied") {
enhanced_msg.push_str("\nSuggestion: Permission issues detected. Check if you have the necessary permissions to create containers or access the specified volumes.");
}
NerdctlError::CommandFailed(enhanced_msg)
},
_ => err
};
nerdctl_error_to_rhai_error(Err(enhanced_error))
}
}
}
/// Start the Container and verify it's running
///
/// This function starts the container and verifies that it's actually running.
/// It returns detailed error information if the container fails to start or
/// if it starts but stops immediately.
pub fn container_start(container: &mut Container) -> Result<CommandResult, Box<EvalAltResult>> {
// Get container details for better error reporting
let container_name = container.name.clone();
let container_id = container.container_id.clone().unwrap_or_else(|| "unknown".to_string());
// Try to start the container
let start_result = container.start();
// Handle the result with improved error context
match start_result {
Ok(result) => {
// Container started successfully
Ok(result)
},
Err(err) => {
// Add more context to the error
let enhanced_error = match err {
NerdctlError::CommandFailed(msg) => {
// Check if this is a "container already running" error, which is not really an error
if msg.contains("already running") {
return Ok(CommandResult {
stdout: format!("Container {} is already running", container_name),
stderr: "".to_string(),
success: true,
code: 0,
});
}
// Try to get more information about why the container might have failed to start
let mut enhanced_msg = format!("Failed to start container '{}' (ID: {}): {}",
container_name, container_id, msg);
// Include the image the container was using for additional context
if let Some(image) = &container.image {
enhanced_msg.push_str(&format!("\nContainer was using image: {}", image));
}
NerdctlError::CommandFailed(enhanced_msg)
},
_ => err
};
nerdctl_error_to_rhai_error(Err(enhanced_error))
}
}
}
/// Stop the Container
pub fn container_stop(container: &mut Container) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(container.stop())
}
/// Remove the Container
pub fn container_remove(container: &mut Container) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(container.remove())
}
/// Execute a command in the Container
pub fn container_exec(container: &mut Container, command: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(container.exec(command))
}
/// Get container logs
pub fn container_logs(container: &mut Container) -> Result<CommandResult, Box<EvalAltResult>> {
// Get container details for better error reporting
let container_name = container.name.clone();
let container_id = container.container_id.clone().unwrap_or_else(|| "unknown".to_string());
// Use the nerdctl::logs function
let logs_result = nerdctl::logs(&container_id);
match logs_result {
Ok(result) => {
Ok(result)
},
Err(err) => {
// Add more context to the error
let enhanced_error = NerdctlError::CommandFailed(
format!("Failed to get logs for container '{}' (ID: {}): {}",
container_name, container_id, err)
);
nerdctl_error_to_rhai_error(Err(enhanced_error))
}
}
}
/// Copy files between the Container and local filesystem
pub fn container_copy(container: &mut Container, source: &str, dest: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(container.copy(source, dest))
}
/// Create a new Map with default run options
pub fn new_run_options() -> Map {
let mut map = Map::new();
map.insert("name".into(), Dynamic::UNIT);
map.insert("detach".into(), Dynamic::from(true));
map.insert("ports".into(), Dynamic::from(Array::new()));
map.insert("snapshotter".into(), Dynamic::from("native"));
map
}
//
// Container Function Wrappers
//
/// Wrapper for nerdctl::run
///
/// Run a container from an image.
pub fn nerdctl_run(image: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::run(image, None, true, None, None))
}
/// Run a container with a name
pub fn nerdctl_run_with_name(image: &str, name: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::run(image, Some(name), true, None, None))
}
/// Run a container with a port mapping
pub fn nerdctl_run_with_port(image: &str, name: &str, port: &str) -> Result<CommandResult, Box<EvalAltResult>> {
let ports = vec![port];
nerdctl_error_to_rhai_error(nerdctl::run(image, Some(name), true, Some(&ports), None))
}
/// Wrapper for nerdctl::exec
///
/// Execute a command in a container.
pub fn nerdctl_exec(container: &str, command: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::exec(container, command))
}
/// Wrapper for nerdctl::copy
///
/// Copy files between container and local filesystem.
pub fn nerdctl_copy(source: &str, dest: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::copy(source, dest))
}
/// Wrapper for nerdctl::stop
///
/// Stop a container.
pub fn nerdctl_stop(container: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::stop(container))
}
/// Wrapper for nerdctl::remove
///
/// Remove a container.
pub fn nerdctl_remove(container: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::remove(container))
}
/// Wrapper for nerdctl::list
///
/// List containers.
pub fn nerdctl_list(all: bool) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::list(all))
}
/// Wrapper for nerdctl::logs
///
/// Get container logs.
pub fn nerdctl_logs(container: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::logs(container))
}
//
// Image Function Wrappers
//
/// Wrapper for nerdctl::images
///
/// List images in local storage.
pub fn nerdctl_images() -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::images())
}
/// Wrapper for nerdctl::image_remove
///
/// Remove one or more images.
pub fn nerdctl_image_remove(image: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::image_remove(image))
}
/// Wrapper for nerdctl::image_push
///
/// Push an image to a registry.
pub fn nerdctl_image_push(image: &str, destination: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::image_push(image, destination))
}
/// Wrapper for nerdctl::image_tag
///
/// Add an additional name to a local image.
pub fn nerdctl_image_tag(image: &str, new_name: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::image_tag(image, new_name))
}
/// Wrapper for nerdctl::image_pull
///
/// Pull an image from a registry.
pub fn nerdctl_image_pull(image: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::image_pull(image))
}
/// Wrapper for nerdctl::image_commit
///
/// Commit a container to an image.
pub fn nerdctl_image_commit(container: &str, image_name: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::image_commit(container, image_name))
}
/// Wrapper for nerdctl::image_build
///
/// Build an image using a Dockerfile.
pub fn nerdctl_image_build(tag: &str, context_path: &str) -> Result<CommandResult, Box<EvalAltResult>> {
nerdctl_error_to_rhai_error(nerdctl::image_build(tag, context_path))
}
```

View File

@@ -0,0 +1,10 @@
[package]
name = "@{name}_rhai"
version = "0.1.0"
edition = "2021"
[dependencies]
rhai = "1.21.0"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
sal = { path = "../../../" }

View File

@@ -0,0 +1,12 @@
use rhai::{Engine, EvalAltResult, Map, Dynamic};
use crate::wrapper;
pub fn create_rhai_engine() -> Engine {
let mut engine = Engine::new();
@for function in functions
engine.register_fn("@{function}", wrapper::@{function});
@end
engine
}
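// For illustration (hypothetical values): with name = "git" and functions containing
// "pull_repository" and "list_repositories", this template would render roughly to:
//
//     pub fn create_rhai_engine() -> Engine {
//         let mut engine = Engine::new();
//         engine.register_fn("pull_repository", wrapper::pull_repository);
//         engine.register_fn("list_repositories", wrapper::list_repositories);
//         engine
//     }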

View File

@@ -0,0 +1,40 @@
use std::{fs, path::Path};
use @{name}_rhai::create_rhai_engine;
fn main() -> Result<(), Box<dyn std::error::Error>> {
println!("=== Rhai Wrapper Example ===");
// Create a Rhai engine with the wrapped functions registered
let engine = create_rhai_engine();
println!("Successfully created Rhai engine");
// Get the path to the example.rhai script
let script_path = get_script_path()?;
println!("Loading script from: {}", script_path.display());
// Load the script content
let script = fs::read_to_string(&script_path)
.map_err(|e| format!("Failed to read script file: {}", e))?;
// Run the script
println!("\n=== Running Rhai script ===");
let result = engine.eval::<i64>(&script)
.map_err(|e| format!("Script execution error: {}", e))?;
println!("\nScript returned: {}", result);
println!("\nExample completed successfully!");
Ok(())
}
fn get_script_path() -> Result<std::path::PathBuf, String> {
// When running with cargo run --example, the script will be in the examples directory
let script_path = Path::new(env!("CARGO_MANIFEST_DIR"))
.join("examples")
.join("example.rhai");
if script_path.exists() {
Ok(script_path)
} else {
Err(format!("Could not find example.rhai script at {}", script_path.display()))
}
}

View File

@@ -0,0 +1,132 @@
use std::collections::HashMap;
use rhai::{Dynamic, Map, Array};
/// Local wrapper trait for sal::rhai::ToRhai to avoid orphan rule violations
pub trait ToRhai {
/// Convert to a Rhai Dynamic value
fn to_rhai(&self) -> Dynamic;
}
// Implementation of ToRhai for Dynamic
impl ToRhai for Dynamic {
fn to_rhai(&self) -> Dynamic {
self.clone()
}
}
/// Generic trait for wrapping Rust functions to be used with Rhai
pub trait RhaiWrapper {
/// Wrap a function that takes ownership of self
fn wrap_consuming<F, R>(self, f: F) -> Dynamic
where
Self: Sized + Clone,
F: FnOnce(Self) -> R,
R: ToRhai;
/// Wrap a function that takes a mutable reference to self
fn wrap_mut<F, R>(&mut self, f: F) -> Dynamic
where
Self: Sized + Clone,
F: FnOnce(&mut Self) -> R,
R: ToRhai;
/// Wrap a function that takes an immutable reference to self
fn wrap<F, R>(&self, f: F) -> Dynamic
where
Self: Sized + Clone,
F: FnOnce(&Self) -> R,
R: ToRhai;
}
/// Implementation of RhaiWrapper for any type
impl<T> RhaiWrapper for T {
fn wrap_consuming<F, R>(self, f: F) -> Dynamic
where
Self: Sized + Clone,
F: FnOnce(Self) -> R,
R: ToRhai,
{
let result = f(self);
result.to_rhai()
}
fn wrap_mut<F, R>(&mut self, f: F) -> Dynamic
where
Self: Sized + Clone,
F: FnOnce(&mut Self) -> R,
R: ToRhai,
{
let result = f(self);
result.to_rhai()
}
fn wrap<F, R>(&self, f: F) -> Dynamic
where
Self: Sized + Clone,
F: FnOnce(&Self) -> R,
R: ToRhai,
{
let result = f(self);
result.to_rhai()
}
}
/// Convert a Rhai Map to a Rust HashMap
pub fn map_to_hashmap(map: &Map) -> HashMap<String, String> {
let mut result = HashMap::new();
for (key, value) in map.iter() {
let k = key.clone().to_string();
let v = value.clone().to_string();
if !k.is_empty() && !v.is_empty() {
result.insert(k, v);
}
}
result
}
/// Convert a HashMap<String, String> to a Rhai Map
pub fn hashmap_to_map(map: &HashMap<String, String>) -> Map {
let mut result = Map::new();
for (key, value) in map.iter() {
result.insert(key.clone().into(), Dynamic::from(value.clone()));
}
result
}
/// Convert a Rhai Array to a Vec of strings
pub fn array_to_vec_string(array: &Array) -> Vec<String> {
array.iter()
.filter_map(|item| {
let s = item.clone().to_string();
if !s.is_empty() { Some(s) } else { None }
})
.collect()
}
/// Helper function to convert Dynamic to Option<String>
pub fn dynamic_to_string_option(value: &Dynamic) -> Option<String> {
if value.is_string() {
Some(value.clone().to_string())
} else {
None
}
}
/// Helper function to convert Dynamic to Option<u32>
pub fn dynamic_to_u32_option(value: &Dynamic) -> Option<u32> {
if value.is_int() {
Some(value.as_int().unwrap() as u32)
} else {
None
}
}
/// Helper function to convert Dynamic to Option<&str> with lifetime management
pub fn dynamic_to_str_option<'a>(value: &Dynamic, storage: &'a mut String) -> Option<&'a str> {
if value.is_string() {
*storage = value.clone().to_string();
Some(storage.as_str())
} else {
None
}
}
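// Usage sketch (hypothetical GitRepo type; assumes this module also provides a ToRhai
// impl for String): wrap an immutable-borrow call so the result reaches Rhai as a
// Dynamic value, mirroring the lifetime fix shown in the errors guide.
//
//     pub fn get_repo_path(repo: &mut GitRepo) -> Dynamic {
//         let repo_clone = repo.clone();
//         repo_clone.wrap(|r| r.path().to_string())
//     }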

View File

@@ -0,0 +1,11 @@
// Re-export the utility modules
pub mod generic_wrapper;
pub mod wrapper;
pub mod engine;
// Re-export the utility traits and functions
pub use generic_wrapper::{RhaiWrapper, map_to_hashmap, array_to_vec_string,
dynamic_to_string_option, hashmap_to_map};
pub use engine::create_rhai_engine;
// The create_rhai_engine function is now in the engine module

View File

@@ -0,0 +1,47 @@
module escalayer
import freeflowuniverse.herolib.clients.openai
// ModelConfig defines the configuration for an AI model
pub struct ModelConfig {
pub mut:
name string
provider string
temperature f32
max_tokens int
}
// Call an AI model using OpenRouter
fn call_ai_model(prompt string, model ModelConfig)! string {
// Get OpenAI client (configured for OpenRouter)
mut client := get_openrouter_client()!
// Create the message for the AI
mut m := openai.Messages{
messages: [
openai.Message{
role: .system
content: 'You are a helpful assistant.'
},
openai.Message{
role: .user
content: prompt
}
]
}
// Call the AI model
res := client.chat_completion(
msgs: m,
model: model.name,
temperature: model.temperature,
max_completion_tokens: model.max_tokens
)!
// Extract the response content
if res.choices.len > 0 {
return res.choices[0].message.content
}
return error('No response from AI model')
}
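// Usage sketch (assumes OPENROUTER_API_KEY is set and that default_base_model(),
// referenced by the task definitions, is available in this module):
//   response := call_ai_model('Summarize this Rust file: ...', default_base_model())!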

View File

@@ -0,0 +1,23 @@
module escalayer
import freeflowuniverse.herolib.clients.openai
import os
// Get an OpenAI client configured for OpenRouter
fn get_openrouter_client()! &openai.OpenAI {
// Get API key from environment variable
api_key := os.getenv('OPENROUTER_API_KEY')
if api_key == '' {
return error('OPENROUTER_API_KEY environment variable not set')
}
// Create OpenAI client with OpenRouter base URL
mut client := openai.get(
name: 'openrouter'
)!
return client
}

View File

@@ -0,0 +1,55 @@
module escalayer
// Task represents a complete AI task composed of multiple sequential unit tasks
pub struct Task {
pub mut:
name string
description string
unit_tasks []UnitTask
current_result string
}
// UnitTaskParams defines the parameters for creating a new unit task
@[params]
pub struct UnitTaskParams {
pub:
name string
prompt_function fn(string) string
callback_function fn(string)! string
base_model ?ModelConfig
retry_model ?ModelConfig
retry_count ?int
}
// Add a new unit task to the task
pub fn (mut t Task) new_unit_task(params UnitTaskParams) &UnitTask {
mut unit_task := UnitTask{
name: params.name
prompt_function: params.prompt_function
callback_function: params.callback_function
base_model: if base_model := params.base_model { base_model } else { default_base_model() }
retry_model: if retry_model := params.retry_model { retry_model } else { default_retry_model() }
retry_count: if retry_count := params.retry_count { retry_count } else { 3 }
}
t.unit_tasks << unit_task
return &t.unit_tasks[t.unit_tasks.len - 1]
}
// Initiate the task execution
pub fn (mut t Task) initiate(input string)! string {
mut current_input := input
for i, mut unit_task in t.unit_tasks {
println('Executing unit task ${i+1}/${t.unit_tasks.len}: ${unit_task.name}')
// Execute the unit task with the current input
result := unit_task.execute(current_input)!
// Update the current input for the next unit task
current_input = result
t.current_result = result
}
return t.current_result
}

View File

@@ -0,0 +1,70 @@
module escalayer
import freeflowuniverse.herolib.clients.openai
// UnitTask represents a single step in the task
pub struct UnitTask {
pub mut:
name string
prompt_function fn(string) string
callback_function fn(string)! string
base_model ModelConfig
retry_model ModelConfig
retry_count int
}
// Execute the unit task
pub fn (mut ut UnitTask) execute(input string)! string {
// Generate the prompt using the prompt function
prompt := ut.prompt_function(input)
// Try with the base model first
mut current_model := ut.base_model
mut attempts := 0
mut max_attempts := ut.retry_count + 1 // +1 for the initial attempt
absolute_max_attempts := 2 * ut.retry_count + 1 // Hard limit: base-model attempts plus escalated attempts
mut last_error := ''
for attempts < max_attempts && attempts < absolute_max_attempts {
attempts++
// If we've exhausted retries with the base model, switch to the retry model
if attempts > ut.retry_count {
println('Escalating to more powerful model: ${ut.retry_model.name}')
current_model = ut.retry_model
// Calculate remaining attempts but don't exceed absolute max
max_attempts = attempts + ut.retry_count
if max_attempts > absolute_max_attempts {
max_attempts = absolute_max_attempts
}
}
println('Attempt ${attempts} with model ${current_model.name}')
// Prepare the prompt with error feedback if this is a retry
mut current_prompt := prompt
if last_error != '' {
current_prompt = 'Previous attempt failed with error: ${last_error}\n\n${prompt}'
}
// Call the AI model
response := call_ai_model(current_prompt, current_model) or {
println('AI call failed: ${err}')
last_error = err.str()
continue // Try again
}
// Process the response with the callback function
result := ut.callback_function(response) or {
// If callback returns an error, retry with the error message
println('Callback returned error: ${err}')
last_error = err.str()
continue // Try again
}
// If we get here, the callback was successful
return result
}
return error('Failed to execute unit task after ${attempts} attempts. Last error: ${last_error}')
}
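// Worked example of the flow above (retry_count = 3): attempts 1-3 run with base_model;
// from attempt 4 onward the unit task escalates to retry_model, with the hard limit above
// bounding the total number of attempts.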