Refactor the herolib repo:

- Remove unused files
- Update the README
- Add all needed scripts to the /scripts dir
- Update script paths in the CI configuration
- Update script paths in V code
- Move installation scripts to the scripts directory
- Change the script path from ./install_v.sh to ./scripts/install_v.sh
Mahmoud-Emad
2025-11-17 15:04:49 +02:00
parent 2ab0dfa6b8
commit 49868a18e1
445 changed files with 18 additions and 40974 deletions
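The path move summarized in the bullets above (./install_v.sh → ./scripts/install_v.sh) can be sketched as a small shell helper that resolves the script from either location. This is purely illustrative and not part of the commit — the commit itself updates every caller to the new path — and the function name is hypothetical:

```shell
# Resolve install_v.sh whether a checkout predates or follows the move.
# Illustrative helper only; not part of this commit.
resolve_install_script() {
  repo_root="$1"
  # Prefer the new scripts/ location, fall back to the old repo root.
  for candidate in "$repo_root/scripts/install_v.sh" "$repo_root/install_v.sh"; do
    if [ -f "$candidate" ]; then
      printf '%s\n' "$candidate"
      return 0
    fi
  done
  echo "install_v.sh not found under $repo_root" >&2
  return 1
}
```

A caller would then run `bash "$(resolve_install_script ~/code/github/incubaid/herolib)"` regardless of which layout is checked out.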

View File

@@ -27,7 +27,7 @@ jobs:
       uses: actions/checkout@v4
     - name: Setup Vlang
-      run: ./install_v.sh
+      run: ./scripts/install_v.sh
     - name: Generate documentation
       run: |

View File

@@ -24,7 +24,7 @@ jobs:
       run: |
         # Updating man-db takes a long time on every run. We don't need it
         sudo apt-get remove -y --purge man-db
-        ./install_v.sh
+        ./scripts/install_v.sh
     - name: Setup Herolib from current branch
       run: |

View File

@@ -1,5 +0,0 @@
when fixing or creating code, refer to the following hints:
@aiprompts/vlang_herolib_core.md

View File

@@ -1,6 +0,0 @@
{
"context": "Workspace",
"bindings": {
"cmd-r": ["task::Spawn", { "task_name": "ET", "reveal_target": "center" }]
}
}

View File

@@ -1,47 +0,0 @@
[
{
"label": "ET",
"command": "for i in {1..5}; do echo \"Hello $i/5\"; sleep 1; done",
//"args": [],
// Env overrides for the command, will be appended to the terminal's environment from the settings.
"env": { "foo": "bar" },
// Current working directory to spawn the command into, defaults to current project root.
//"cwd": "/path/to/working/directory",
// Whether to use a new terminal tab or reuse the existing one to spawn the process, defaults to `false`.
"use_new_terminal": true,
// Whether to allow multiple instances of the same task to be run, or rather wait for the existing ones to finish, defaults to `false`.
"allow_concurrent_runs": false,
// What to do with the terminal pane and tab, after the command was started:
// * `always` — always show the task's pane, and focus the corresponding tab in it (default)
// * `no_focus` — always show the task's pane, add the task's tab in it, but don't focus it
// * `never` — do not alter focus, but still add/reuse the task's tab in its pane
"reveal": "always",
// What to do with the terminal pane and tab, after the command has finished:
// * `never` — Do nothing when the command finishes (default)
// * `always` — always hide the terminal tab, hide the pane also if it was the last tab in it
// * `on_success` — hide the terminal tab on task success only, otherwise behaves similar to `always`
"hide": "never",
// Which shell to use when running a task inside the terminal.
// May take 3 values:
// 1. (default) Use the system's default terminal configuration in /etc/passwd
// "shell": "system"
// 2. A program:
// "shell": {
// "program": "sh"
// }
// 3. A program with arguments:
// "shell": {
// "with_arguments": {
// "program": "/bin/bash",
// "args": ["--login"]
// }
// }
"shell": "system",
// Whether to show the task line in the output of the spawned task, defaults to `true`.
"show_summary": true,
// Whether to show the command line in the output of the spawned task, defaults to `true`.
// "show_output": true,
// Represents the tags for inline runnable indicators, or spawning multiple tasks at once.
"tags": ["DODO"]
}
]

View File

@@ -24,7 +24,7 @@ Thank you for your interest in contributing to Herolib! This document provides g
 For developers, you can use the automated installation script:
 ```bash
-curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/install_v.sh' > /tmp/install_v.sh
+curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/scripts/install_v.sh' > /tmp/install_v.sh
 bash /tmp/install_v.sh --analyzer --herolib
 # IMPORTANT: Start a new shell after installation for paths to be set correctly
 ```

View File

@@ -35,11 +35,11 @@ The Hero tool can be used to work with git, build documentation, interact with H
 For development purposes, use the automated installation script:
 ```bash
-curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/install_v.sh' > /tmp/install_v.sh
+curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/scripts/install_v.sh' > /tmp/install_v.sh
 bash /tmp/install_v.sh --analyzer --herolib
 #do not forget to do the following this makes sure vtest and vrun exists
-cd ~/code/github/incubaid/herolib
+cd ~/code/github/incubaid/herolib/scripts
 v install_herolib.vsh
 # IMPORTANT: Start a new shell after installation for paths to be set correctly
@@ -51,7 +51,7 @@ v install_herolib.vsh
 ```
 V & HeroLib Installer Script
-Usage: ~/code/github/incubaid/herolib/install_v.sh [options]
+Usage: ~/code/github/incubaid/herolib/scripts/install_v.sh [options]
 Options:
 -h, --help Show this help message
@@ -61,12 +61,12 @@ Options:
 --herolib Install our herolib
 Examples:
-~/code/github/incubaid/herolib/install_v.sh
-~/code/github/incubaid/herolib/install_v.sh --reset
-~/code/github/incubaid/herolib/install_v.sh --remove
-~/code/github/incubaid/herolib/install_v.sh --analyzer
-~/code/github/incubaid/herolib/install_v.sh --herolib
-~/code/github/incubaid/herolib/install_v.sh --reset --analyzer # Fresh install of both
+~/code/github/incubaid/herolib/scripts/install_v.sh
+~/code/github/incubaid/herolib/scripts/install_v.sh --reset
+~/code/github/incubaid/herolib/scripts/install_v.sh --remove
+~/code/github/incubaid/herolib/scripts/install_v.sh --analyzer
+~/code/github/incubaid/herolib/scripts/install_v.sh --herolib
+~/code/github/incubaid/herolib/scripts/install_v.sh --reset --analyzer # Fresh install of both
 ```
 ## Features
@@ -175,46 +175,3 @@ To generate documentation locally:
 cd ~/code/github/incubaid/herolib
 bash doc.sh
 ```
## Export Behavior - Cross-Collection Assets
When exporting collections, Atlas now automatically handles cross-collection references:
### Pages
If a page in Collection A links to a page in Collection B:
- The target page is copied to Collection A's export directory
- Filename is renamed to avoid conflicts: `collectionb_pagename.md`
- The link is updated to reference the local file
### Images and Files
Images and files referenced from other collections are:
- Copied to the `img/` subdirectory of the exporting collection
- Renamed with cross-collection prefix: `othercol_filename.ext`
- Image/file references in pages are updated to the new paths
### Result
The exported collection directory is **self-contained**:
```
destination/
collectiona/
.collection
page1.md
collectionb_intro.md # Copied from Collection B
img/
logo.png # Local image
collectionc_logo.png # Copied from Collection C
```
### Metadata Export
Metadata is now exported separately using the `destination_meta` parameter:
```heroscript
!!atlas.export
destination: './output'
destination_meta: './metadata' # Saves JSON metadata files here
include: true
redis: true
```
This exports collection metadata to: `./metadata/collection1.json`, `./metadata/collection2.json`, etc.
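The layout described above — one exported directory per collection, plus one JSON metadata file per collection under `destination_meta` — can be sanity-checked with a short shell sketch. The function name and directory names here are illustrative assumptions, not taken from the commit:

```shell
# Verify that every exported collection directory under $1 has a
# matching <name>.json metadata file under $2, mirroring the
# ./output + ./metadata layout described above.
check_export_meta() {
  out_dir="$1"
  meta_dir="$2"
  for dir in "$out_dir"/*/; do
    name=$(basename "$dir")
    if [ ! -f "$meta_dir/$name.json" ]; then
      echo "missing metadata for $name" >&2
      return 1
    fi
  done
  echo "all collections have metadata"
}
```

For example, `check_export_meta ./output ./metadata` would fail if `./output/collection2/` existed without a `./metadata/collection2.json`.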

View File

@@ -16,4 +16,4 @@ NC='\033[0m' # No Color
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 cd "$SCRIPT_DIR"
-/workspace/herolib/install_v.sh
+/workspace/herolib/scripts/install_v.sh

View File

@@ -5,8 +5,8 @@ SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
 cd "$SCRIPT_DIR"
 # Copy installation files
-cp ../../install_v.sh ./scripts/install_v.sh
-cp ../../install_herolib.vsh ./scripts/install_herolib.vsh
+cp ../../scripts/install_v.sh ./scripts/install_v.sh
+cp ../../scripts/install_herolib.vsh ./scripts/install_herolib.vsh
 # Docker image and container names
 DOCKER_IMAGE_NAME="herolib"

View File

@@ -1,31 +0,0 @@
{
"folders": [
{
"path": "."
},
],
"settings": {
"extensions.ignoreRecommendations": false
},
"extensions": {
"unwantedRecommendations": [],
"recommendations": [
"saoudrizwan.claude-dev",
"shakram02.bash-beautify",
"ms-vsliveshare.vsliveshare",
"yzhang.markdown-all-in-one",
"elesoho.vscode-markdown-paste-image",
"ms-vscode-remote.remote-ssh",
"ms-vscode-remote.remote-ssh-edit",
"ms-vscode.remote-explorer",
"ms-vscode.remote-repositories",
"charliermarsh.ruff",
"bmalehorn.shell-syntax",
"qwtel.sqlite-viewer",
"simonsiefke.svg-preview",
"gruntfuggly.todo-tree",
"vosca.vscode-v-analyzer",
"tomoki1207.pdf"
]
}
}

View File

@@ -57,10 +57,10 @@ pub fn (mut node Node) hero_install(args HeroInstallArgs) ! {
 	mut todo := []string{}
 	if !args.compile {
-		todo << 'curl https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/install_hero.sh > /tmp/install.sh'
+		todo << 'curl https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/scripts/install_hero.sh > /tmp/install.sh'
 		todo << 'bash /tmp/install.sh'
 	} else {
-		todo << "curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/install_v.sh' > /tmp/install_v.sh"
+		todo << "curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/scripts/install_v.sh' > /tmp/install_v.sh"
 		if args.v_analyzer {
 			todo << 'bash /tmp/install_v.sh --analyzer --herolib '
 		} else {

View File

@@ -101,7 +101,7 @@ pub fn compile(args InstallArgs) ! {
 	cmd := "
 	cd /tmp
 	export TERM=xterm
-	curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/install_v.sh' > /tmp/install_v.sh
+	curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/scripts/install_v.sh' > /tmp/install_v.sh
 	bash /tmp/install_v.sh --herolib
 	"
 	osal.execute_stdout(cmd) or { return error('Cannot install hero.\n${err}') }

View File

@@ -1,51 +0,0 @@
# Base Object and Actor Backend
This is Hero's backend, designed around the concept of base objects and actors to enable modular, domain-specific operations.
## Base Object
Base objects are digital representations of real-world entities. Examples include projects, publications, books, stories (agile), and calendar events. These objects:
• Serve as the primary data units that actors operate on.
• Contain indexable fields for efficient retrieval.
• Share a common base class with attributes like:
• Name: The object's identifier.
• Description: A brief summary of the object.
• Remarks: A list of additional notes or metadata.
Base objects are stored, indexed, retrieved, and updated using OSIS (Object Storage and Indexing System).
## Actor
Actors are domain-specific operation handlers that work on base objects. For instance, a Project Manager Actor might manage operations on stories, sprints, or projects.
Key Features of Actors:
• Domain-Specific Languages (DSLs): Actor methods form intuitive, logical DSLs for interacting with base objects.
• Specification-Driven:
• Actors are generated from specifications.
• Code written for actor methods can be parsed back into specifications.
• Code Generation: Specifications enable automated boilerplate code generation, reducing manual effort.
## Modules
### OSIS: Object Storage and Indexing System
OSIS is a module designed for efficient storage and indexing of root objects based on specific fields. It enables seamless management of data across various backends, with built-in support for field-based filtering and searching.
#### Key Components
**Indexer:**
* Creates and manages SQL tables based on base object specifications.
* Enables indexing of specific fields, making them searchable and filterable.
**Storer**:
* Handles actual data storage in different databases.
* Supports diverse encoding and encryption methods for secure data management.
By integrating OSIS, the backend achieves both high-performance data querying and flexible, secure storage solutions.
### Example Actor Module
The Example Actor module is a reference and testable example of a generated actor within Baobab. It demonstrates the structure of actor modules generated from specifications and can also be parsed back into specifications. This module serves two key purposes:
1. Acts as a reference for developers working on Baobab to understand and program against actor specifications.
2. Provides a compilable, generatable module for testing and validating Baobab's code generation tools.

View File

@@ -1,68 +0,0 @@
module actor
import json
import incubaid.herolib.clients.redisclient
import incubaid.herolib.baobab.action { ProcedureCall, ProcedureResponse }
// Processor struct for managing procedure calls
pub struct Client {
pub mut:
rpc redisclient.RedisRpc // Redis RPC mechanism
}
// Parameters for processing a procedure call
@[params]
pub struct Params {
pub:
timeout int = 60 // Timeout in seconds
}
pub struct ClientConfig {
pub:
redis_url string // url to redis server running
redis_queue string // name of redis queue
}
pub fn new_client(config ClientConfig) !Client {
mut redis := redisclient.new(config.redis_url)!
mut rpc_q := redis.rpc_get(config.redis_queue)
return Client{
rpc: rpc_q
}
}
// Process the procedure call
pub fn (mut p Client) monologue(call ProcedureCall, params Params) ! {
// Use RedisRpc's `call` to send the call and wait for the response
response_data := p.rpc.call(redisclient.RPCArgs{
cmd: call.method
data: call.params
timeout: u64(params.timeout * 1000) // Convert seconds to milliseconds
wait: true
})!
// TODO: check error type
}
// Process the procedure call
pub fn (mut p Client) call_to_action(call ProcedureCall, params Params) !ProcedureResponse {
// Use RedisRpc's `call` to send the call and wait for the response
response_data := p.rpc.call(redisclient.RPCArgs{
cmd: call.method
data: call.params
timeout: u64(params.timeout * 1000) // Convert seconds to milliseconds
wait: true
}) or {
// TODO: check error type
return ProcedureResponse{
error: err.msg()
}
// return ProcedureError{
// reason: .timeout
// }
}
return ProcedureResponse{
result: response_data
}
}

View File

@@ -1,63 +0,0 @@
# Generator
The Generator synchronizes actor code and specifications, allowing bidirectional transformation between the two.
This a
## Development Workflow
A sample development workflow using the generator would be like:
1. generating actor specification from an actor openrpc / openapi specification (see [specification reflection](specification/#reflection))
2. generating actor code from the actor specification
3. updating actor code by filling in method prototypes
4. adding methods to the actor to develop actor further
5. parsing specification back from actor
6. regenerating actor from the specification
this allows for
- a tool which takes dir as input
- is just some v files which define models
- outputs a generated code dir with
- heroscript to memory for the model
- supporting v script for manipulated model
- name of actor e.g. ProjectManager, module would be project_manager
## how does the actor work
- is a global e.g. projectmanager_factory
- with double map
- key1: cid
- object: ProjectManager Object
- Object: Project Manager
- has as properties:
- db_$rootobjectname which is map
- key: oid
- val: the Model which represents the rootobject
- on factory
- actions_process
- process heroscript through path or text (params)
- action_process
- take 1 action as input
- ${rootobjectname}_export
- export all known objects as heroscript in chosen dir
- name of heroscript would be ${rootobjectname}_define.md
- ${rootobjectname}_get(oid)
- returns rootobject as copy
- ${rootobjectname}_list()!
- returns list as copy
- ${rootobjectname}_set(oid,obj)!
- ${rootobjectname}_delete(oid)!
- ${rootobjectname}_new()!
- in action we have
- define
- export/import
- get
- list

View File

@@ -1,118 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.jsonschema.codegen
import incubaid.herolib.schemas.openrpc.codegen as openrpc_codegen
import incubaid.herolib.baobab.specification
import net.http
// pub enum BaseObjectMethodType {
// new
// get
// set
// delete
// list
// other
// }
// pub struct BaseObjectMethod {
// pub:
// typ BaseObjectMethodType
// object string // the name of the base object
// }
// pub fn ts_client_get_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async get${name_pascal}(id: string): Promise<${name_pascal}> {\n return this.restClient.get<${name_pascal}>(`/${root}/${name_snake}/\${id}`);\n }"
// }
// pub fn ts_client_set_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async set${name_pascal}(id: string, ${name_snake}: Partial<${name_pascal}>): Promise<${name_pascal}> {\n return this.restClient.put<${name_pascal}>(`/${root}/${name_snake}/\${id}`, ${name_snake});\n }"
// }
// pub fn ts_client_delete_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async delete${name_pascal}(id: string): Promise<void> {\n return this.restClient.delete<void>(`/${root}/${name_snake}/\${id}`);\n }"
// }
// pub fn ts_client_list_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async list${name_pascal}(): Promise<${name_pascal}[]> {\n return this.restClient.get<${name_pascal}[]>(`/${root}/${name_snake}`);\n }"
// }
fn get_endpoint_root(root string) string {
return if root == '' {
''
} else {
'/${root.trim('/')}'
}
}
// // generates a Base Object's `create` method
// pub fn ts_client_new_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async create${name_snake}(object: Omit<${name_pascal}, 'id'>): Promise<${name_pascal}> {
// return this.restClient.post<${name_pascal}>('${root}/${name_snake}', board);
// }"
// }
// pub fn ts_client_get_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async get${name_pascal}(id: string): Promise<${name_pascal}> {\n return this.restClient.get<${name_pascal}>(`/${root}/${name_snake}/\${id}`);\n }"
// }
// pub fn ts_client_set_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async set${name_pascal}(id: string, ${name_snake}: Partial<${name_pascal}>): Promise<${name_pascal}> {\n return this.restClient.put<${name_pascal}>(`/${root}/${name_snake}/\${id}`, ${name_snake});\n }"
// }
// pub fn ts_client_delete_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async delete${name_pascal}(id: string): Promise<void> {\n return this.restClient.delete<void>(`/${root}/${name_snake}/\${id}`);\n }"
// }
// pub fn ts_client_list_fn(object string, params TSClientFunctionParams) string {
// name_snake := texttools.snake_case(object)
// name_pascal := texttools.pascal_case(object)
// root := get_endpoint_root(params.endpoint)
// return "async list${name_pascal}(): Promise<${name_pascal}[]> {\n return this.restClient.get<${name_pascal}[]>(`/${root}/${name_snake}`);\n }"
// }
// // generates a function prototype given an `ActorMethod`
// pub fn ts_client_fn_prototype(method ActorMethod) string {
// name := texttools.pascal_case(method.name)
// params := method.parameters
// .map(content_descriptor_to_parameter(it) or {panic(err)})
// .map(it.typescript())
// .join(', ')
// return_type := content_descriptor_to_parameter(method.result) or {panic(err)}.typ.typescript()
// return 'async ${name}(${params}): Promise<${return_type}> {}'
// }

View File

@@ -1,205 +0,0 @@
module generator
import x.json2 as json
import arrays
import incubaid.herolib.develop.codetools as code
import incubaid.herolib.baobab.specification
import incubaid.herolib.schemas.openrpc
import incubaid.herolib.schemas.jsonschema
const specification = specification.ActorSpecification{
name: 'Pet Store'
description: 'A sample API for a pet store'
structure: code.Struct{}
interfaces: [.openapi]
methods: [
specification.ActorMethod{
name: 'listPets'
summary: 'List all pets'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example limit'
description: 'Example Maximum number of pets to return'
value: 10
}),
]
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: json.raw_decode('[
{"id": 1, "name": "Fluffy", "tag": "dog"},
{"id": 2, "name": "Whiskers", "tag": "cat"}
]')!
})
}
parameters: [
openrpc.ContentDescriptor{
name: 'limit'
summary: 'Maximum number of pets to return'
description: 'Maximum number of pets to return'
required: false
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
example: 10
})
},
]
result: openrpc.ContentDescriptor{
name: 'pets'
description: 'A paged array of pets'
schema: jsonschema.SchemaRef(jsonschema.Schema{
typ: 'array'
items: jsonschema.Items(jsonschema.SchemaRef(jsonschema.Schema{
id: 'pet'
title: 'Pet'
typ: 'object'
properties: {
'id': jsonschema.SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
'tag': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
}))
})
}
errors: [
openrpc.ErrorSpec{
code: 400
message: 'Invalid request'
},
]
},
specification.ActorMethod{
name: 'createPet'
summary: 'Create a new pet'
example: openrpc.ExamplePairing{
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: '[]'
})
}
result: openrpc.ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
}
errors: [
openrpc.ErrorSpec{
code: 400
message: 'Invalid input'
},
]
},
specification.ActorMethod{
name: 'getPet'
summary: 'Get a pet by ID'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example petId'
description: 'Example ID of the pet to retrieve'
value: 1
}),
]
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: json.raw_decode('{"id": 1, "name": "Fluffy", "tag": "dog"}')!
})
}
parameters: [
openrpc.ContentDescriptor{
name: 'petId'
summary: 'ID of the pet to retrieve'
description: 'ID of the pet to retrieve'
required: true
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
format: 'uint32'
example: 1
})
},
]
result: openrpc.ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
schema: jsonschema.SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/Pet'
})
}
errors: [
openrpc.ErrorSpec{
code: 404
message: 'Pet not found'
},
]
},
specification.ActorMethod{
name: 'deletePet'
summary: 'Delete a pet by ID'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example petId'
description: 'Example ID of the pet to delete'
value: 1
}),
]
}
parameters: [
openrpc.ContentDescriptor{
name: 'petId'
summary: 'ID of the pet to delete'
description: 'ID of the pet to delete'
required: true
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
example: 1
})
},
]
result: openrpc.ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
}
errors: [
openrpc.ErrorSpec{
code: 404
message: 'Pet not found'
},
]
},
]
objects: [
specification.BaseObject{
schema: jsonschema.Schema{
title: 'Pet'
typ: 'object'
properties: {
'id': jsonschema.schema_u32
'name': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
'tag': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
}
required: ['id', 'name']
}
},
]
}
fn test_typescript_client_folder() {
client := typescript_client_folder(specification)
}

View File

@@ -1,47 +0,0 @@
module generator
// pub fn generate_object_code(actor Struct, object BaseObject) VFile {
// obj_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// mut items := []CodeItem{}
// items = [generate_new_method(actor, object), generate_get_method(actor, object),
// generate_set_method(actor, object), generate_delete_method(actor, object),
// generate_list_result_struct(actor, object), generate_list_method(actor, object)]
// items << generate_object_methods(actor, object)
// mut file := code.new_file(
// mod: texttools.name_fix(actor.name)
// name: obj_name
// imports: [
// Import{
// mod: object.structure.mod
// types: [object_type]
// },
// Import{
// mod: 'incubaid.herolib.baobab.backend'
// types: ['FilterParams']
// },
// ]
// items: items
// )
// if object.structure.fields.any(it.attrs.any(it.name == 'index')) {
// // can't filter without indices
// filter_params := generate_filter_params(actor, object)
// file.items << filter_params.map(CodeItem(it))
// file.items << generate_filter_method(actor, object)
// }
// return file
// }
// pub fn (a Actor) generate_model_files() ![]VFile {
// structs := a.objects.map(it.structure)
// return a.objects.map(code.new_file(
// mod: texttools.name_fix(a.name)
// name: '${texttools.name_fix(it.structure.name)}_model'
// // imports: [Import{mod:'incubaid.herolib.baobab.stage'}]
// items: [it.structure]
// ))
// }

View File

@@ -1,406 +0,0 @@
module generator
import incubaid.herolib.baobab.specification
import incubaid.herolib.develop.codetools as code { Param, type_from_symbol }
import incubaid.herolib.core.texttools
const id_param = Param{
name: 'id'
typ: type_from_symbol('u32')
}
// pub fn generate_object_code(actor Struct, object BaseObject) VFile {
// obj_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// mut items := []CodeItem{}
// items = [generate_new_method(actor, object), generate_get_method(actor, object),
// generate_set_method(actor, object), generate_delete_method(actor, object),
// generate_list_result_struct(actor, object), generate_list_method(actor, object)]
// items << generate_object_methods(actor, object)
// mut file := code.new_file(
// mod: texttools.name_fix(actor.name)
// name: obj_name
// imports: [
// Import{
// mod: object.structure.mod
// types: [object_type]
// },
// Import{
// mod: 'incubaid.herolib.baobab.backend'
// types: ['FilterParams']
// },
// ]
// items: items
// )
// if object.structure.fields.any(it.attrs.any(it.name == 'index')) {
// // can't filter without indices
// filter_params := generate_filter_params(actor, object)
// file.items << filter_params.map(CodeItem(it))
// file.items << generate_filter_method(actor, object)
// }
// return file
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// fn generate_get_method(actor Struct, object BaseObject) Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// get_method := Function{
// name: 'get_${object_name}'
// description: 'gets the ${object_name} with the given object id'
// receiver: Param{
// mutable: true
// name: 'actor'
// typ: type_from_symbol(actor.name)
// }
// }
// params: [generator.id_param]
// result: Param{
// typ: type_from_symbol(object.structure.name)
// is_result: true
// }
// body: 'return actor.backend.get[${object_type}](id)!'
// }
// return get_method
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// fn generate_set_method(actor Struct, object BaseObject) Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// param_getters := generate_param_getters(
// structure: object.structure
// prefix: ''
// only_mutable: true
// )
// body := 'actor.backend.set[${object_type}](${object_name})!'
// get_method := Function{
// name: 'set_${object_name}'
// description: 'updates the ${object.structure.name} with the given object id'
// receiver: Param{
// mutable: true
// name: 'actor'
// typ: type_from_symbol(actor.name)
// }
// }
// params: [
// Param{
// name: object_name
// typ: Type{
// symbol: object_type
// }
// },
// ]
// result: Param{
// is_result: true
// }
// body: body
// }
// return get_method
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// fn generate_delete_method(actor Struct, object BaseObject) Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// body := 'actor.backend.delete[${object_type}](id)!'
// get_method := Function{
// name: 'delete_${object_name}'
// description: 'deletes the ${object.structure.name} with the given object id'
// receiver: Param{
// mutable: true
// name: 'actor'
// typ: Type{
// symbol: actor.name
// }
// }
// params: [generator.id_param]
// result: Param{
// is_result: true
// }
// body: body
// }
// return get_method
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// fn generate_new_method(actor Struct, object BaseObject) Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// param_getters := generate_param_getters(
// structure: object.structure
// prefix: ''
// only_mutable: false
// )
// body := 'return actor.backend.new[${object_type}](${object_name})!'
// new_method := Function{
// name: 'new_${object_name}'
// description: 'news the ${object.structure.name} with the given object id'
// receiver: Param{
// name: 'actor'
// typ: Type{
// symbol: actor.name
// }
// mutable: true
// }
// params: [
// Param{
// name: object_name
// typ: Type{
// symbol: object_type
// }
// },
// ]
// result: Param{
// is_result: true
// typ: Type{
// symbol: 'u32'
// }
// }
// body: body
// }
// return new_method
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// fn generate_list_result_struct(actor Struct, object BaseObject) Struct {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// return Struct{
// name: '${object_type}List'
// is_pub: true
// fields: [
// StructField{
// name: 'items'
// typ: Type{
// symbol: '[]${object_type}'
// }
// },
// ]
// }
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// fn generate_list_method(actor Struct, object BaseObject) Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// list_struct := Struct{
// name: '${object_type}List'
// fields: [
// StructField{
// name: 'items'
// typ: Type{
// symbol: '[]${object_type}'
// }
// },
// ]
// }
// param_getters := generate_param_getters(
// structure: object.structure
// prefix: ''
// only_mutable: false
// )
// body := 'return ${object_type}List{items:actor.backend.list[${object_type}]()!}'
// result_struct := generate_list_result_struct(actor, object)
// mut result := Param{}
// result.typ.symbol = result_struct.name
// result.is_result = true
// new_method := Function{
// name: 'list_${object_name}'
// description: 'lists all of the ${object_name} objects'
// receiver: Param{
// name: 'actor'
// typ: Type{
// symbol: actor.name
// }
// mutable: true
// }
// params: []
// result: result
// body: body
// }
// return new_method
// }
// fn generate_filter_params(actor Struct, object BaseObject) []Struct {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// return [
// Struct{
// name: 'Filter${object_type}Params'
// fields: [
// StructField{
// name: 'filter'
// typ: Type{
// symbol: '${object_type}Filter'
// }
// },
// StructField{
// name: 'params'
// typ: Type{
// symbol: 'FilterParams'
// }
// },
// ]
// },
// Struct{
// name: '${object_type}Filter'
// fields: object.structure.fields.filter(it.attrs.any(it.name == 'index'))
// },
// ]
// }
// // generate_filter_method generates a filter method for the provided object
// fn generate_filter_method(actor Struct, object BaseObject) Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// param_getters := generate_param_getters(
// structure: object.structure
// prefix: ''
// only_mutable: false
// )
// params_type := 'Filter${object_type}Params'
// body := 'return actor.backend.filter[${object_type}, ${object_type}Filter](filter.filter, filter.params)!'
// return Function{
// name: 'filter_${object_name}'
// description: 'filters the ${object_name} objects by their index fields'
// receiver: Param{
// name: 'actor'
// typ: Type{
// symbol: actor.name
// }
// mutable: true
// }
// params: [
// Param{
// name: 'filter'
// typ: Type{
// symbol: params_type
// }
// },
// ]
// result: Param{
// typ: Type{
// symbol: '[]${object_type}'
// }
// is_result: true
// }
// body: body
// }
// }
// // // generate_object_methods generates CRUD actor methods for a provided structure
// // fn generate_object_methods(actor Struct, object BaseObject) []Function {
// // object_name := texttools.snake_case(object.structure.name)
// // object_type := object.structure.name
// // mut funcs := []Function{}
// // for method in object.methods {
// // mut params := [Param{
// // name: 'id'
// // typ: Type{
// // symbol: 'u32'
// // }
// // }]
// // params << method.params
// // funcs << Function{
// // name: method.name
// // description: method.description
// // receiver: Param{
// // name: 'actor'
// // typ: Type{
// // symbol: actor.name
// // }
// // mutable: true
// // }
// // params: params
// // result: method.result
// // body: 'obj := actor.backend.get[${method.receiver.typ.symbol}](id)!
// // obj.${method.name}(${method.params.map(it.name).join(',')})
// // actor.backend.set[${method.receiver.typ.symbol}](obj)!
// // '
// // }
// // }
// // return funcs
// // }
// @[params]
// struct GenerateParamGetters {
// structure Struct
// prefix string
// only_mutable bool // if true generates param.get methods for only mutable struct fields. Used for updating.
// }
// fn generate_param_getters(params GenerateParamGetters) []string {
// mut param_getters := []string{}
// fields := if params.only_mutable {
// params.structure.fields.filter(it.is_mut && it.is_pub)
// } else {
// params.structure.fields.filter(it.is_pub)
// }
// for field in fields {
// if field.typ.symbol.starts_with_capital() {
// subgetters := generate_param_getters(GenerateParamGetters{
// ...params
// structure: field.structure
// prefix: '${field.name}_'
// })
// // name of the nested object, used for param declaration
// // ex: fruits []Fruit becomes fruit_name
// nested_name := field.structure.name.to_lower()
// if field.typ.is_map {
// param_getters.insert(0, '${nested_name}_key := params.get(\'${nested_name}_key\')!')
// param_getters << '${field.name}: {${nested_name}_key: ${field.structure.name}}{'
// } else if field.typ.is_array {
// param_getters << '${field.name}: [${field.structure.name}{'
// } else {
// param_getters << '${field.name}: ${field.structure.name}{'
// }
// param_getters << subgetters
// param_getters << if field.typ.is_array { '}]' } else { '}' }
// continue
// }
// mut get_method := '${field.name}: params.get'
// if field.typ.symbol != 'string' {
// // TODO: check if params method actually exists
// get_method += '_${field.typ.symbol}'
// }
// if field.default != '' {
// get_method += '_default'
// }
// get_method = get_method + "('${params.prefix}${field.name}')!"
// param_getters << get_method
// }
// return param_getters
// }
// @[params]
// struct GetChildField {
// parent Struct @[required]
// child Struct @[required]
// }
// fn get_child_field(params GetChildField) StructField {
// fields := params.parent.fields.filter(it.typ.symbol == 'map[string]&${params.child.name}')
// if fields.len != 1 {
// panic('this should never happen')
// }
// return fields[0]
// }


@@ -1,168 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code
import incubaid.herolib.baobab.specification
import rand
import incubaid.herolib.core.texttools
// // generate_object_test_code generates test code for the generated actor methods of the provided object
// pub fn generate_object_test_code(actor Struct, object BaseObject) !VFile {
// consts := CustomCode{"const db_dir = '\${os.home_dir()}/hero/db'
// const actor_name = '${actor.name}_test_actor'"}
// clean_code := 'mut actor := get(name: actor_name)!\nactor.backend.reset()!'
// testsuite_begin := Function{
// name: 'testsuite_begin'
// body: clean_code
// }
// testsuite_end := Function{
// name: 'testsuite_end'
// body: clean_code
// }
// actor_name := texttools.name_fix(actor.name)
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// // TODO: support modules outside of crystal
// mut file := VFile{
// name: '${object_name}_test'
// mod: texttools.name_fix(actor_name)
// imports: [
// Import{
// mod: 'os'
// },
// Import{
// mod: '${object.structure.mod}'
// types: [object_type]
// },
// ]
// items: [
// consts,
// testsuite_begin,
// testsuite_end,
// generate_new_method_test(actor, object)!,
// generate_get_method_test(actor, object)!,
// ]
// }
// if object.structure.fields.any(it.attrs.any(it.name == 'index')) {
// // can't filter without indices
// file.items << generate_filter_test(actor, object)!
// }
// return file
// }
// // generate_new_method_test generates a test for the generated new method
// fn generate_new_method_test(actor Struct, object BaseObject) !Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// required_fields := object.structure.fields.filter(it.attrs.any(it.name == 'required'))
// mut fields := []string{}
// for field in required_fields {
// mut field_decl := '${field.name}: ${get_mock_value(field.typ.symbol())!}'
// fields << field_decl
// }
// body := 'mut actor := get(name: actor_name)!
// mut ${object_name}_id := actor.new_${object_name}(${object_type}{${fields.join(',')}})!
// assert ${object_name}_id == 1
// ${object_name}_id = actor.new_${object_name}(${object_type}{${fields.join(',')}})!
// assert ${object_name}_id == 2'
// return Function{
// name: 'test_new_${object_name}'
// description: 'tests creating a new ${object_type}'
// result: code.Param{
// is_result: true
// }
// body: body
// }
// }
// // generate_get_method_test generates a test for the generated get method
// fn generate_get_method_test(actor Struct, object BaseObject) !Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// required_fields := object.structure.fields.filter(it.attrs.any(it.name == 'required'))
// mut fields := []string{}
// for field in required_fields {
// mut field_decl := '${field.name}: ${get_mock_value(field.typ.symbol())!}'
// fields << field_decl
// }
// body := 'mut actor := get(name: actor_name)!
// mut ${object_name} := ${object_type}{${fields.join(',')}}
// ${object_name}.id = actor.new_${object_name}(${object_name})!
// assert ${object_name} == actor.get_${object_name}(${object_name}.id)!'
// return Function{
// name: 'test_get_${object_name}'
// description: 'tests getting a ${object_type} by object id'
// result: code.Param{
// is_result: true
// }
// body: body
// }
// }
// // generate_filter_test generates a test for the generated filter method
// fn generate_filter_test(actor Struct, object BaseObject) !Function {
// object_name := texttools.snake_case(object.structure.name)
// object_type := object.structure.name
// index_fields := object.structure.fields.filter(it.attrs.any(it.name == 'index'))
// if index_fields.len == 0 {
// return error('Cannot generate filter method test for object without any index fields')
// }
// mut index_tests := []string{}
// for i, field in index_fields {
// val := get_mock_value(field.typ.symbol())!
// index_field := '${field.name}: ${val}' // index field assignment line
// mut fields := [index_field]
// fields << get_required_fields(object.structure)!
// index_tests << '${object_name}_id${i} := actor.new_${object_name}(${object_type}{${fields.join(',')}})!
// ${object_name}_list${i} := actor.filter_${object_name}(
// filter: ${object_type}Filter{${index_field}}
// )!
// assert ${object_name}_list${i}.len == 1
// assert ${object_name}_list${i}[0].${field.name} == ${val}
// '
// }
// body := 'mut actor := get(name: actor_name)!
// \n${index_tests.join('\n\n')}'
// return Function{
// name: 'test_filter_${object_name}'
// description: 'tests filtering ${object_type} objects by their index fields'
// result: code.Param{
// is_result: true
// }
// body: body
// }
// }
// fn get_required_fields(s Struct) ![]string {
// required_fields := s.fields.filter(it.attrs.any(it.name == 'required'))
// mut fields := []string{}
// for field in required_fields {
// fields << '${field.name}: ${get_mock_value(field.typ.symbol())!}'
// }
// return fields
// }
// fn get_mock_value(typ string) !string {
// if typ == 'string' {
// return "'mock_string_${rand.string(3)}'"
// } else if typ == 'int' || typ == 'u32' {
// return '42'
// } else {
// return error('mock values for types other than strings and numbers are not yet supported')
// }
// }


@@ -1,179 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, CustomCode, Function, Import, Object, Param, Result, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.openrpc { ContentDescriptor, Example }
import incubaid.herolib.schemas.jsonschema.codegen { schemaref_to_type }
import incubaid.herolib.baobab.specification { ActorMethod, ActorSpecification }
fn generate_handle_file(spec ActorSpecification) !VFile {
mut items := []CodeItem{}
items << CustomCode{generate_handle_function(spec)}
for method in spec.methods {
items << generate_method_handle(spec.name, method)!
}
return VFile{
name: 'act'
imports: [
Import{
mod: 'incubaid.herolib.baobab.stage'
types: ['Action']
},
Import{
mod: 'incubaid.herolib.core.texttools'
},
Import{
mod: 'x.json2 as json'
},
]
items: items
}
}
pub fn generate_handle_function(spec ActorSpecification) string {
actor_name_pascal := texttools.pascal_case(spec.name)
mut routes := []string{}
// Iterate over OpenAPI paths and operations
for method in spec.methods {
operation_id := method.name
// Generate route case
route := generate_route_case(operation_id, 'handle_${operation_id}')
routes << route
}
// Combine the generated handlers and main router into a single file
return [
'// AUTO-GENERATED FILE - DO NOT EDIT MANUALLY',
'',
'pub fn (mut actor ${actor_name_pascal}Actor) act(action Action) !Action {',
' return match texttools.snake_case(action.name) {',
routes.join('\n'),
' else {',
' return error("Unknown operation: \${action.name}")',
' }',
' }',
'}',
].join('\n')
}
pub fn generate_method_handle(actor_name string, method ActorMethod) !Function {
actor_name_pascal := texttools.pascal_case(actor_name)
name_fixed := texttools.snake_case(method.name)
mut body := ''
if method.parameters.len == 1 {
param := method.parameters[0]
param_name := texttools.snake_case(param.name)
decode_stmt := generate_decode_stmt('action.params', param)!
body += '${param_name} := ${decode_stmt}\n'
}
if method.parameters.len > 1 {
body += 'params_arr := json.raw_decode(action.params)!.arr()\n'
for i, param in method.parameters {
param_name := texttools.snake_case(param.name)
decode_stmt := generate_decode_stmt('params_arr[${i}].str()', param)!
body += '${param_name} := ${decode_stmt}\n'
}
}
call_stmt := generate_call_stmt(actor_name, method)!
body += '${call_stmt}\n'
body += '${generate_return_stmt(method)!}\n'
return Function{
name: 'handle_${name_fixed}'
description: '// Handler for ${name_fixed}\n'
receiver: Param{
name: 'actor'
mutable: true
typ: Object{'${actor_name_pascal}Actor'}
}
params: [Param{
name: 'action'
typ: Object{'Action'}
}]
result: Param{
typ: Result{Object{'Action'}}
}
body: body
}
}
fn method_is_void(method ActorMethod) !bool {
return schemaref_to_type(method.result.schema).vgen().trim_space() == ''
}
pub fn generate_example_method_handle(actor_name string, method ActorMethod) !Function {
actor_name_pascal := texttools.pascal_case(actor_name)
name_fixed := texttools.snake_case(method.name)
body := if !method_is_void(method)! {
if method.example.result is Example {
'return Action{...action, result: json.encode(\'${method.example.result.value}\')}'
} else {
'return action'
}
} else {
'return action'
}
return Function{
name: 'handle_${name_fixed}_example'
description: '// Handler for ${name_fixed}\n'
receiver: Param{
name: 'actor'
mutable: true
typ: Object{'${actor_name_pascal}Actor'}
}
params: [Param{
name: 'action'
typ: Object{'Action'}
}]
result: Param{
typ: Result{Object{'Action'}}
}
body: body
}
}
fn generate_call_stmt(name string, method ActorMethod) !string {
mut call_stmt := if schemaref_to_type(method.result.schema).vgen().trim_space() != '' {
'${texttools.snake_case(method.result.name)} := '
} else {
''
}
method_name := texttools.snake_case(method.name)
snake_name := texttools.snake_case(name)
param_names := method.parameters.map(texttools.snake_case(it.name))
call_stmt += 'actor.${snake_name}.${method_name}(${param_names.join(', ')})!'
return call_stmt
}
fn generate_return_stmt(method ActorMethod) !string {
if schemaref_to_type(method.result.schema).vgen().trim_space() != '' {
return 'return Action{...action, result: json.encode(${texttools.snake_case(method.result.name)})}'
}
return 'return action'
}
// Helper function to generate a case block for the main router
fn generate_route_case(case string, handler_name string) string {
name_fixed := texttools.snake_case(handler_name)
return "'${texttools.snake_case(case)}' {actor.${name_fixed}(action)}"
}
// generates decode statement for variable with given name
fn generate_decode_stmt(name string, param ContentDescriptor) !string {
param_type := schemaref_to_type(param.schema)
if param_type is Object {
return 'json.decode[${schemaref_to_type(param.schema).vgen()}](${name})!'
} else if param_type is code.Array {
return 'json.decode[${schemaref_to_type(param.schema).vgen()}](${name})'
}
param_symbol := param_type.vgen()
return if param_symbol == 'string' {
'${name}.str()'
} else {
'${name}.${param_type.vgen()}()'
}
}
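Putting the pieces above together, `generate_handle_function` and `generate_route_case` emit a dispatcher of roughly this shape (a sketch for a hypothetical `Petstore` actor exposing a single `listPets` method; the names are illustrative, not from the source):

```v
// AUTO-GENERATED FILE - DO NOT EDIT MANUALLY
// Sketch of the act dispatcher emitted for a hypothetical Petstore actor.
pub fn (mut actor PetstoreActor) act(action Action) !Action {
	return match texttools.snake_case(action.name) {
		'list_pets' { actor.handle_list_pets(action) }
		else {
			return error('Unknown operation: ${action.name}')
		}
	}
}
```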


@@ -1,82 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { File, Folder, IFile, IFolder }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.core.texttools
import incubaid.herolib.baobab.specification { ActorInterface, ActorSpecification }
import json
@[params]
pub struct Params {
pub:
interfaces []ActorInterface // the interfaces to be supported
}
pub fn generate_actor_folder(spec ActorSpecification, params Params) !Folder {
mut files := []IFile{}
mut folders := []IFolder{}
files = [generate_readme_file(spec)!]
mut docs_files := []IFile{}
mut spec_files := []IFile{}
// generate code files for supported interfaces
for iface in params.interfaces {
match iface {
.openrpc {
// convert actor spec to openrpc spec
openrpc_spec := spec.to_openrpc()
spec_files << generate_openrpc_file(openrpc_spec)!
}
.openapi {
// convert actor spec to openapi spec
openapi_spec_raw := spec.to_openapi()
spec_files << generate_openapi_file(openapi_spec_raw)!
openapi_spec := openapi.process(openapi_spec_raw)!
folders << generate_openapi_ts_client(openapi_spec)!
}
else {}
}
}
specs_folder := Folder{
name: 'specs'
files: spec_files
}
// folder with docs
folders << Folder{
name: 'docs'
files: docs_files
folders: [specs_folder]
}
folders << generate_scripts_folder(spec.name, false)
folders << generate_examples_folder()!
// create module with code files and docs folder
name_fixed := texttools.snake_case(spec.name)
return Folder{
name: '${name_fixed}'
files: files
folders: folders
modules: [generate_actor_module(spec, params)!]
}
}
fn generate_readme_file(spec ActorSpecification) !File {
return File{
name: 'README'
extension: 'md'
content: '# ${spec.name}\n${spec.description}'
}
}
pub fn generate_examples_folder() !Folder {
return Folder{
name: 'examples'
}
}


@@ -1,118 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CustomCode, IFile, IFolder, Module, VFile }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.core.texttools
import incubaid.herolib.baobab.specification { ActorInterface, ActorSpecification }
import json
pub fn generate_module_from_openapi(openapi_path string) !string {
// the actor specification obtained from the OpenRPC Specification
openapi_spec := openapi.new(path: openapi_path)!
actor_spec := specification.from_openapi(openapi_spec)!
actor_module := generate_actor_module(actor_spec,
interfaces: [.openapi, .http]
)!
return actor_module.write_str()!
}
pub fn generate_actor_module(spec ActorSpecification, params Params) !Module {
mut files := []IFile{}
mut folders := []IFolder{}
files = [
generate_actor_file(spec)!,
generate_actor_test_file(spec)!,
generate_specs_file(spec.name, params.interfaces)!,
generate_handle_file(spec)!,
generate_methods_file(spec)!,
generate_methods_interface_file(spec)!,
generate_methods_example_file(spec)!,
generate_client_file(spec)!,
generate_model_file(spec)!,
]
// generate code files for supported interfaces
for iface in params.interfaces {
match iface {
.openrpc {
// convert actor spec to openrpc spec
openrpc_spec := spec.to_openrpc()
iface_file, iface_test_file := generate_openrpc_interface_files(params.interfaces)
files << iface_file
files << iface_test_file
}
.openapi {
// convert actor spec to openapi spec
openapi_spec_raw := spec.to_openapi()
openapi_spec := openapi.process(openapi_spec_raw)!
// generate openapi code files
iface_file, iface_test_file := generate_openapi_interface_files(params.interfaces)
files << iface_file
files << iface_test_file
}
.http {
// interfaces that have http controllers
controllers := params.interfaces.filter(it == .openrpc || it == .openapi)
// generate http interface code files
iface_file, iface_test_file := generate_http_interface_files(controllers)
files << iface_file
files << iface_test_file
}
.command {
files << generate_command_file(spec)!
}
else {
return error('unsupported interface ${iface}')
}
}
}
// create module with code files and docs folder
name_fixed := texttools.snake_case(spec.name)
return code.new_module(
name: '${name_fixed}'
description: spec.description
files: files
folders: folders
in_src: true
)
}
fn generate_actor_file(spec ActorSpecification) !VFile {
dollar := '$'
version := spec.version
name_snake := texttools.snake_case(spec.name)
name_pascal := texttools.pascal_case(spec.name)
actor_code := $tmpl('./templates/actor.v.template')
return VFile{
name: 'actor'
items: [CustomCode{actor_code}]
}
}
fn generate_actor_test_file(spec ActorSpecification) !VFile {
dollar := '$'
actor_name_snake := texttools.snake_case(spec.name)
actor_name_pascal := texttools.pascal_case(spec.name)
actor_test_code := $tmpl('./templates/actor_test.v.template')
return VFile{
name: 'actor_test'
items: [CustomCode{actor_test_code}]
}
}
fn generate_specs_file(name string, interfaces []ActorInterface) !VFile {
support_openrpc := ActorInterface.openrpc in interfaces
support_openapi := ActorInterface.openapi in interfaces
dollar := '$'
actor_name_snake := texttools.snake_case(name)
actor_name_pascal := texttools.pascal_case(name)
actor_code := $tmpl('./templates/specifications.v.template')
return VFile{
name: 'specifications'
items: [CustomCode{actor_code}]
}
}


@@ -1,276 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code
import incubaid.herolib.baobab.specification
import incubaid.herolib.schemas.openrpc
import incubaid.herolib.schemas.jsonschema
import os
import x.json2 as json
const actor_spec = specification.ActorSpecification{
name: 'Pet Store'
description: 'A sample API for a pet store'
structure: code.Struct{}
interfaces: [.openapi]
methods: [
specification.ActorMethod{
name: 'listPets'
summary: 'List all pets'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example limit'
description: 'Example Maximum number of pets to return'
value: 10
}),
]
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: json.raw_decode('[
{"id": 1, "name": "Fluffy", "tag": "dog"},
{"id": 2, "name": "Whiskers", "tag": "cat"}
]')!
})
}
parameters: [
openrpc.ContentDescriptor{
name: 'limit'
summary: 'Maximum number of pets to return'
description: 'Maximum number of pets to return'
required: false
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
example: 10
})
},
]
result: openrpc.ContentDescriptor{
name: 'pets'
description: 'A paged array of pets'
schema: jsonschema.SchemaRef(jsonschema.Schema{
typ: 'array'
items: jsonschema.Items(jsonschema.SchemaRef(jsonschema.Schema{
id: 'pet'
title: 'Pet'
typ: 'object'
properties: {
'id': jsonschema.SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
'tag': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
}))
})
}
errors: [
openrpc.ErrorSpec{
code: 400
message: 'Invalid request'
},
]
},
specification.ActorMethod{
name: 'newPet'
summary: 'Create a new pet'
parameters: [
openrpc.ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
schema: jsonschema.SchemaRef(jsonschema.Schema{
id: 'pet'
title: 'Pet'
typ: 'object'
properties: {
'id': jsonschema.SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
'tag': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
}
required: ['id', 'name']
})
},
]
example: openrpc.ExamplePairing{
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: '[]'
})
}
result: openrpc.ContentDescriptor{
name: 'petId'
summary: 'ID of the created pet'
description: 'ID of the created pet'
required: true
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
example: 1
})
}
errors: [
openrpc.ErrorSpec{
code: 400
message: 'Invalid input'
},
]
},
specification.ActorMethod{
name: 'getPet'
summary: 'Get a pet by ID'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example petId'
description: 'Example ID of the pet to retrieve'
value: 1
}),
]
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: json.raw_decode('{"id": 1, "name": "Fluffy", "tag": "dog"}')!
})
}
parameters: [
openrpc.ContentDescriptor{
name: 'petId'
summary: 'ID of the pet to retrieve'
description: 'ID of the pet to retrieve'
required: true
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
format: 'uint32'
example: 1
})
},
]
result: openrpc.ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
schema: jsonschema.SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/Pet'
})
}
errors: [
openrpc.ErrorSpec{
code: 404
message: 'Pet not found'
},
]
},
specification.ActorMethod{
name: 'deletePet'
summary: 'Delete a pet by ID'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example petId'
description: 'Example ID of the pet to delete'
value: 1
}),
]
}
parameters: [
openrpc.ContentDescriptor{
name: 'petId'
summary: 'ID of the pet to delete'
description: 'ID of the pet to delete'
required: true
schema: jsonschema.SchemaRef(jsonschema.Schema{
...jsonschema.schema_u32
example: 1
})
},
]
result: openrpc.ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
}
errors: [
openrpc.ErrorSpec{
code: 404
message: 'Pet not found'
},
]
},
]
objects: [
specification.BaseObject{
schema: jsonschema.Schema{
title: 'Pet'
typ: 'object'
properties: {
'id': jsonschema.schema_u32
'name': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
'tag': jsonschema.SchemaRef(jsonschema.Schema{
typ: 'string'
})
}
required: ['id', 'name']
}
},
]
}
const destination = '${os.dir(@FILE)}/testdata'
fn test_generate_plain_actor_module() {
// plain actor module without interfaces
actor_module := generate_actor_module(actor_spec)!
actor_module.write(destination,
format: true
overwrite: true
test: true
)!
}
fn test_generate_actor_module_with_openrpc_interface() {
// plain actor module without interfaces
actor_module := generate_actor_module(actor_spec, interfaces: [.openrpc])!
actor_module.write(destination,
format: true
overwrite: true
test: true
)!
}
fn test_generate_actor_module_with_openapi_interface() {
// plain actor module without interfaces
actor_module := generate_actor_module(actor_spec,
interfaces: [.openapi]
)!
actor_module.write(destination,
format: true
overwrite: true
test: true
)!
}
fn test_generate_actor_module_with_all_interfaces() {
// plain actor module without interfaces
actor_module := generate_actor_module(actor_spec,
interfaces: [.openapi, .openrpc, .http]
)!
actor_module.write(destination,
format: true
overwrite: true
test: true
)!
}


@@ -1,136 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, CustomCode, Function, Import, Param, Result, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.jsonschema.codegen as jsonschema_codegen { schemaref_to_type }
import incubaid.herolib.schemas.openrpc.codegen { content_descriptor_to_parameter }
import incubaid.herolib.baobab.specification { ActorMethod, ActorSpecification }
pub fn generate_client_file(spec ActorSpecification) !VFile {
actor_name_snake := texttools.snake_case(spec.name)
actor_name_pascal := texttools.pascal_case(spec.name)
mut items := []CodeItem{}
items << CustomCode{'
pub struct Client {
stage.Client
}
fn new_client(config stage.ActorConfig) !Client {
return Client {
Client: stage.new_client(config)!
}
}'}
for method in spec.methods {
items << generate_client_method(method)!
}
return VFile{
imports: [
Import{
mod: 'incubaid.herolib.baobab.stage'
},
Import{
mod: 'incubaid.herolib.core.redisclient'
},
Import{
mod: 'x.json2 as json'
types: ['Any']
},
]
name: 'client_actor'
items: items
}
}
pub fn generate_example_client_file(spec ActorSpecification) !VFile {
actor_name_snake := texttools.snake_case(spec.name)
actor_name_pascal := texttools.pascal_case(spec.name)
mut items := []CodeItem{}
items << CustomCode{"
pub struct Client {
stage.Client
}
fn new_client() !Client {
mut redis := redisclient.new('localhost:6379')!
mut rpc_q := redis.rpc_get('actor_example_\${name}')
return Client{
rpc: rpc_q
}
}"}
for method in spec.methods {
items << generate_client_method(method)!
}
return VFile{
imports: [
Import{
mod: 'incubaid.herolib.baobab.stage'
},
Import{
mod: 'incubaid.herolib.core.redisclient'
},
Import{
mod: 'x.json2 as json'
types: ['Any']
},
]
name: 'client'
items: items
}
}
pub fn generate_client_method(method ActorMethod) !Function {
name_fixed := texttools.snake_case(method.name)
call_params := if method.parameters.len > 0 {
method.parameters.map(texttools.snake_case(it.name)).map('Any(${it}.str())').join(', ')
} else {
''
}
params_stmt := if method.parameters.len == 0 {
''
} else if method.parameters.len == 1 {
'params := json.encode(${texttools.snake_case(method.parameters[0].name)})'
} else {
'mut params_arr := []Any{}
params_arr = [${call_params}]
params := json.encode(params_arr.str())
'
}
mut client_call_stmt := "action := client.call_to_action(
name: '${name_fixed}'"
if params_stmt != '' {
client_call_stmt += '\nparams: params'
}
client_call_stmt += ')!'
result_type := schemaref_to_type(method.result.schema).vgen().trim_space()
result_stmt := if result_type == '' {
''
} else {
'return json.decode[${result_type}](action.result)!'
}
result_param := content_descriptor_to_parameter(method.result)!
return Function{
receiver: code.new_param(v: 'mut client Client')!
result: Param{
...result_param
typ: Result{result_param.typ}
}
name: name_fixed
body: '${params_stmt}\n${client_call_stmt}\n${result_stmt}'
summary: method.summary
description: method.description
params: method.parameters.map(content_descriptor_to_parameter(it)!)
}
}
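A method emitted by `generate_client_method` ends up looking roughly like this (a sketch for a hypothetical `get_pet(pet_id u32) !Pet` actor method; `Pet` is a stand-in type, and the `call_to_action` call and JSON round-trip follow the statements assembled above):

```v
// Sketch of a generated client method for a hypothetical get_pet actor method.
pub fn (mut client Client) get_pet(pet_id u32) !Pet {
	// single parameter: encoded directly rather than as a params array
	params := json.encode(pet_id)
	action := client.call_to_action(
		name: 'get_pet'
		params: params
	)!
	return json.decode[Pet](action.result)!
}
```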


@@ -1,78 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, CustomCode, Import, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.baobab.specification { ActorMethod, ActorSpecification }
pub fn generate_command_file(spec ActorSpecification) !VFile {
mut items := []CodeItem{}
items << CustomCode{generate_cmd_function(spec)}
for method in spec.methods {
items << CustomCode{generate_method_cmd_function(spec.name, method)}
}
return VFile{
name: 'command'
imports: [
Import{
mod: 'incubaid.herolib.ui.console'
},
Import{
mod: 'cli'
types: ['Command', 'Flag']
},
]
items: items
}
}
pub fn generate_cmd_function(spec ActorSpecification) string {
actor_name_snake := texttools.snake_case(spec.name)
mut cmd_function := "
pub fn cmd() Command {
mut cmd := Command{
name: '${actor_name_snake}'
usage: ''
description: '${spec.description}'
}
"
mut method_cmds := []string{}
for method in spec.methods {
method_cmds << generate_method_cmd(method)
}
cmd_function += '${method_cmds.join_lines()}}'
return cmd_function
}
pub fn generate_method_cmd(method ActorMethod) string {
method_name_snake := texttools.snake_case(method.name)
return "
mut cmd_${method_name_snake} := Command{
sort_flags: true
name: '${method_name_snake}'
execute: cmd_${method_name_snake}_execute
description: '${method.description}'
}
"
}
pub fn generate_method_cmd_function(actor_name string, method ActorMethod) string {
actor_name_snake := texttools.snake_case(actor_name)
method_name_snake := texttools.snake_case(method.name)
method_call := if method.result.name == '' {
'${actor_name_snake}.${method_name_snake}()!'
} else {
'result := ${actor_name_snake}.${method_name_snake}()!'
}
return '
fn cmd_${method_name_snake}_execute(cmd Command) ! {
${method_call}
}
'
}


@@ -1,48 +0,0 @@
module generator
import incubaid.herolib.baobab.specification { ActorInterface }
import incubaid.herolib.develop.codetools as code { CustomCode, VFile }
fn generate_openrpc_interface_files(interfaces []ActorInterface) (VFile, VFile) {
http := ActorInterface.http in interfaces
iface_file := VFile{
name: 'interface_openrpc'
items: [CustomCode{$tmpl('./templates/interface_openrpc.v.template')}]
}
iface_test_file := VFile{
name: 'interface_openrpc_test'
items: [CustomCode{$tmpl('./templates/interface_openrpc_test.v.template')}]
}
return iface_file, iface_test_file
}
fn generate_openapi_interface_files(interfaces []ActorInterface) (VFile, VFile) {
http := ActorInterface.http in interfaces
dollar := '$'
iface_file := VFile{
name: 'interface_openapi'
items: [CustomCode{$tmpl('./templates/interface_openapi.v.template')}]
}
iface_test_file := VFile{
name: 'interface_openapi_test'
items: [CustomCode{$tmpl('./templates/interface_openapi_test.v.template')}]
}
return iface_file, iface_test_file
}
fn generate_http_interface_files(controllers []ActorInterface) (VFile, VFile) {
dollar := '$'
openapi := ActorInterface.openapi in controllers
openrpc := ActorInterface.openrpc in controllers
iface_file := VFile{
name: 'interface_http'
items: [CustomCode{$tmpl('./templates/interface_http.v.template')}]
}
iface_test_file := VFile{
name: 'interface_http_test'
items: [CustomCode{$tmpl('./templates/interface_http_test.v.template')}]
}
return iface_file, iface_test_file
}

View File

@@ -1,165 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, Function, Import, Param, Result, Struct, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.openrpc
import incubaid.herolib.schemas.openrpc.codegen { content_descriptor_to_parameter, content_descriptor_to_struct }
import incubaid.herolib.schemas.jsonschema { Schema }
import incubaid.herolib.schemas.jsonschema.codegen as jsonschema_codegen
import incubaid.herolib.baobab.specification { ActorMethod, ActorSpecification }
const crud_prefixes = ['new', 'get', 'set', 'delete', 'list']
pub struct Source {
openapi_path ?string
openrpc_path ?string
}
pub fn generate_methods_file_str(source Source) !string {
actor_spec := if path := source.openapi_path {
specification.from_openapi(openapi.new(path: path)!)!
} else if path := source.openrpc_path {
specification.from_openrpc(openrpc.new(path: path)!)!
} else {
panic('No openapi or openrpc path provided')
}
return generate_methods_file(actor_spec)!.write_str()!
}
pub fn generate_methods_file(spec ActorSpecification) !VFile {
name_snake := texttools.snake_case(spec.name)
receiver := generate_methods_receiver(spec.name)
receiver_param := Param{
mutable: true
name: name_snake[0].ascii_str() // receiver is first letter of domain
typ: Result{code.Object{receiver.name}}
}
mut items := [CodeItem(receiver), CodeItem(generate_core_factory(receiver_param))]
for method in spec.methods {
items << generate_method_code(receiver_param, ActorMethod{
...method
category: spec.method_type(method)
})!
}
return VFile{
name: 'methods'
imports: [
Import{
mod: 'incubaid.herolib.baobab.osis'
types: ['OSIS']
},
]
items: items
}
}
fn generate_methods_receiver(name string) Struct {
return Struct{
is_pub: true
name: '${texttools.pascal_case(name)}'
fields: [
code.StructField{
is_mut: true
name: 'osis'
typ: code.Object{'OSIS'}
},
]
}
}
fn generate_core_factory(receiver Param) Function {
return Function{
is_pub: true
name: 'new_${texttools.snake_case(receiver.typ.symbol().trim_left('!?'))}'
body: 'return ${receiver.typ.symbol().trim_left('!?')}{osis: osis.new()!}'
result: receiver
}
}
// returns bodyless method prototype
pub fn generate_method_code(receiver Param, method ActorMethod) ![]CodeItem {
mut method_code := []CodeItem{}
// TODO: document assumption
obj_params := method.parameters.filter(if it.schema is Schema {
it.schema.typ == 'object'
} else {
false
}).map(content_descriptor_to_struct(it))
if obj_param := obj_params[0] {
method_code << obj_param
}
// check if method is a Base Object CRUD Method and
// if so generate the method's body
// TODO: smart generation of method body using AI
// body := match method.category {
// .base_object_new { base_object_new_body(receiver, method)! }
// .base_object_get { base_object_get_body(receiver, method)! }
// .base_object_set { base_object_set_body(receiver, method)! }
// .base_object_delete { base_object_delete_body(receiver, method)! }
// .base_object_list { base_object_list_body(receiver, method)! }
// else { "panic('implement')" }
// }
body := "panic('implement')"
fn_prototype := generate_method_prototype(receiver, method)!
method_code << Function{
...fn_prototype
body: body
}
return method_code
}
// returns bodyless method prototype
pub fn generate_method_prototype(receiver Param, method ActorMethod) !Function {
result_param := content_descriptor_to_parameter(method.result)!
return Function{
name: texttools.snake_case(method.name)
receiver: receiver
result: Param{
...result_param
typ: Result{result_param.typ}
}
summary: method.summary
description: method.description
params: method.parameters.map(content_descriptor_to_parameter(it)!)
}
}
fn base_object_new_body(receiver Param, method ActorMethod) !string {
parameter := content_descriptor_to_parameter(method.parameters[0])!
return 'return ${receiver.name}.osis.new[${parameter.typ.vgen()}](${texttools.snake_case(parameter.name)})!'
}
fn base_object_get_body(receiver Param, method ActorMethod) !string {
parameter := content_descriptor_to_parameter(method.parameters[0])!
result := content_descriptor_to_parameter(method.result)!
return 'return ${receiver.name}.osis.get[${result.typ.vgen()}](${texttools.snake_case(parameter.name)})!'
}
fn base_object_set_body(receiver Param, method ActorMethod) !string {
parameter := content_descriptor_to_parameter(method.parameters[0])!
return 'return ${receiver.name}.osis.set[${parameter.typ.vgen()}](${parameter.name})!'
}
fn base_object_delete_body(receiver Param, method ActorMethod) !string {
parameter := content_descriptor_to_parameter(method.parameters[0])!
return '${receiver.name}.osis.delete(${texttools.snake_case(parameter.name)})!'
}
fn base_object_list_body(receiver Param, method ActorMethod) !string {
// result := content_descriptor_to_parameter(method.result)!
// log.error('result typ: ${result.typ}')
// base_object_type := (result.typ as Array).typ
// return 'return ${receiver.name}.osis.list[${base_object_type.symbol()}]()!'
return 'return'
}

View File

@@ -1,106 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, Function, Import, Param, Result, Struct, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.openrpc { Example }
import incubaid.herolib.schemas.jsonschema
import incubaid.herolib.schemas.jsonschema.codegen as jsonschema_codegen
import incubaid.herolib.schemas.openrpc.codegen { content_descriptor_to_parameter }
import incubaid.herolib.baobab.specification { ActorMethod, ActorSpecification }
import incubaid.herolib.schemas.openapi
pub fn generate_methods_example_file_str(source Source) !string {
actor_spec := if path := source.openapi_path {
specification.from_openapi(openapi.new(path: path)!)!
} else if path := source.openrpc_path {
specification.from_openrpc(openrpc.new(path: path)!)!
} else {
panic('No openapi or openrpc path provided')
}
return generate_methods_example_file(actor_spec)!.write_str()!
}
pub fn generate_methods_example_file(spec ActorSpecification) !VFile {
name_snake := texttools.snake_case(spec.name)
receiver := generate_example_methods_receiver(spec.name)
receiver_param := Param{
mutable: true
name: name_snake[0].ascii_str()
typ: Result{code.Object{receiver.name}}
}
mut items := [CodeItem(receiver), CodeItem(generate_core_example_factory(receiver_param))]
for method in spec.methods {
items << generate_method_example_code(receiver_param, ActorMethod{
...method
category: spec.method_type(method)
})!
}
return VFile{
name: 'methods_example'
imports: [
Import{
mod: 'incubaid.herolib.baobab.osis'
types: ['OSIS']
},
Import{
mod: 'x.json2 as json'
},
]
items: items
}
}
fn generate_core_example_factory(receiver Param) Function {
return Function{
is_pub: true
name: 'new_${texttools.snake_case(receiver.typ.symbol().trim_left('!?'))}'
body: 'return ${receiver.typ.symbol().trim_left('!?')}{OSIS: osis.new()!}'
result: receiver
}
}
fn generate_example_methods_receiver(name string) Struct {
return Struct{
is_pub: true
name: '${texttools.pascal_case(name)}Example'
embeds: [Struct{
name: 'OSIS'
}]
}
}
// returns bodyless method prototype
pub fn generate_method_example_code(receiver Param, method ActorMethod) ![]CodeItem {
result_param := content_descriptor_to_parameter(method.result)!
mut method_code := []CodeItem{}
// TODO: document assumption
// obj_params := method.parameters.filter(if it.schema is Schema {it.schema.typ == 'object'} else {false}).map(schema_to_struct(it.schema as Schema))
// if obj_param := obj_params[0] {
// method_code << Struct{...obj_param, name: method.name}
// }
// check if method is a Base Object CRUD Method and
// if so generate the method's body
body := if !method_is_void(method)! {
if method.example.result is Example {
"json_str := '${method.example.result.value}'
return ${generate_decode_stmt('json_str',
method.result)!}"
} else {
'return ${result_param.typ.empty_value()}'
}
} else {
''
}
fn_prototype := generate_method_prototype(receiver, method)!
method_code << Function{
...fn_prototype
body: body
}
return method_code
}

View File

@@ -1,49 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, Import, Param, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.openrpc.codegen
import incubaid.herolib.baobab.specification { ActorSpecification }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.openrpc
pub fn generate_methods_interface_file_str(source Source) !string {
actor_spec := if path := source.openapi_path {
specification.from_openapi(openapi.new(path: path)!)!
} else if path := source.openrpc_path {
specification.from_openrpc(openrpc.new(path: path)!)!
} else {
panic('No openapi or openrpc path provided')
}
return generate_methods_interface_file(actor_spec)!.write_str()!
}
pub fn generate_methods_interface_file(spec ActorSpecification) !VFile {
return VFile{
name: 'methods_interface'
imports: [
Import{
mod: 'incubaid.herolib.baobab.osis'
types: ['OSIS']
},
]
items: [CodeItem(generate_methods_interface_declaration(spec)!)]
}
}
// returns bodyless method prototype
pub fn generate_methods_interface_declaration(spec ActorSpecification) !code.Interface {
name_snake := texttools.snake_case(spec.name)
name_pascal := texttools.pascal_case(spec.name)
receiver := generate_methods_receiver(spec.name)
receiver_param := Param{
mutable: true
name: name_snake[0].ascii_str()
typ: code.Object{receiver.name}
}
return code.Interface{
is_pub: true
name: 'I${name_pascal}'
methods: spec.methods.map(generate_method_prototype(receiver_param, it)!)
}
}

View File

@@ -1,32 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { CodeItem, Struct, VFile }
import incubaid.herolib.core.texttools
import incubaid.herolib.schemas.jsonschema.codegen { schema_to_struct }
import incubaid.herolib.baobab.specification { ActorSpecification }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.openrpc
pub fn generate_model_file_str(source Source) !string {
actor_spec := if path := source.openapi_path {
specification.from_openapi(openapi.new(path: path)!)!
} else if path := source.openrpc_path {
specification.from_openrpc(openrpc.new(path: path)!)!
} else {
panic('No openapi or openrpc path provided')
}
return generate_model_file(actor_spec)!.write_str()!
}
pub fn generate_model_file(spec ActorSpecification) !VFile {
return VFile{
name: 'model'
items: spec.objects.map(CodeItem(Struct{
...schema_to_struct(it.schema)
is_pub: true
}))
}
}

View File

@@ -1,160 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { File, Folder }
import incubaid.herolib.schemas.openapi { OpenAPI, Operation }
import incubaid.herolib.schemas.openapi.codegen
import incubaid.herolib.schemas.jsonschema.codegen as jsonschema_codegen { schema_to_type }
import net.http
pub fn generate_openapi_file(specification OpenAPI) !File {
openapi_json := specification.encode_json()
return File{
name: 'openapi'
extension: 'json'
content: openapi_json
}
}
pub fn generate_openapi_ts_client(specification OpenAPI) !Folder {
return codegen.ts_client_folder(specification,
body_generator: body_generator
custom_client_code: ' private restClient: HeroRestClient;
constructor(heroKeysClient: any, debug: boolean = true) {
this.restClient = new HeroRestClient(heroKeysClient, debug);
}
'
)!
}
fn body_generator(op Operation, path_ string, method http.Method) string {
path := path_.replace('{', '\${')
return match method {
.post {
if schema := op.payload_schema() {
symbol := schema_to_type(schema).typescript()
"return this.restClient.post<${symbol}>('${path}', data);"
} else {
''
}
}
.get {
if schema := op.response_schema() {
// if op.params.len
symbol := schema_to_type(schema).typescript()
"return this.restClient.get<${symbol}>('${path}', data);"
} else {
''
}
}
else {
''
}
}
// return if operation_is_base_object_method(op) {
// bo_method := operation_to_base_object_method(op)
// match method_type(op) {
// .new { ts_client_new_body(op, path) }
// .get { ts_client_get_body(op, path) }
// .set { ts_client_set_body(op, path) }
// .delete { ts_client_delete_body(op, path) }
// .list { ts_client_list_body(op, path) }
// else {''}
// }
// } else {''}
}
// pub fn operation_is_base_object_method(op openapi.Operation, base_objs []string) BaseObjectMethod {
// // name := texttools.pascal_case(op.operation_id)
// // if op.operation_id.starts_with('new') {
// // if op.&& operation.params.len == 1
// return true
// }
// pub fn operation_to_base_object_method(op openapi.Operation) BaseObjectMethod {
// if op.operation_id.starts_with('update')
// }
// pub fn openapi_ts_client_body(op openapi.Operation, path string, method http.Method) string {
// match method {
// post {
// if schema := op.payload_schema() {
// symbol := schema_to_type(schema).typescript()
// return "return this.restClient.post<${symbol}>('${path}', data);"
// }
// }
// }
// return if operation_is_base_object_method(op) {
// bo_method := operation_to_base_object_method(op)
// match bo_method. {
// .new { ts_client_new_body(op, path) }
// .get { ts_client_get_body(op, path) }
// .set { ts_client_set_body(op, path) }
// .delete { ts_client_delete_body(op, path) }
// .list { ts_client_list_body(op, path) }
// else {''}
// }
// } else {''}
// }
fn get_endpoint(path string) string {
return if path == '' {
''
} else {
'/${path.trim('/')}'
}
}
// // generates a Base Object's `create` method
// fn ts_client_new_body(op Operation, path string) string {
// // the parameter of a base object new method is always the base object
// bo_param := openapi_codegen.parameter_to_param(op.parameters[0])
// return "return this.restClient.post<${bo_param.typ.typescript()}>('${get_endpoint(path)}', ${bo_param.name});"
// }
// // generates a Base Object's `create` method
// fn ts_client_get_body(op Operation, path string) string {
// // the parameter of a base object get method is always the id
// id_param := openapi_codegen.parameter_to_param(op.parameters[0])
// return "return this.restClient.get<${id_param.typ.typescript()}>('${get_endpoint(path)}', ${id_param.name});"
// }
// // generates a Base Object's `create` method
// fn ts_client_set_body(op Operation, path string) string {
// // the parameter of a base object set method is always the base object
// bo_param := openapi_codegen.parameter_to_param(op.parameters[0])
// return "return this.restClient.put<${bo_param.typ.typescript()}>('${get_endpoint(path)}', ${bo_param.name});"
// }
// // generates a Base Object's `delete` method
// fn ts_client_delete_body(op Operation, path string) string {
// // the parameter of a base object delete method is always the id
// id_param := openapi_codegen.parameter_to_param(op.parameters[0])
// return "return this.restClient.get<${id_param.typ.typescript()}>('${get_endpoint(path)}', ${id_param.name});"
// }
// // generates a Base Object's `list` method
// fn ts_client_list_body(op Operation, path string) string {
// // the result parameter of a base object list method is always the array of bo
// result_param := openapi_codegen.parameter_to_param(op.parameters[0])
// return "return this.restClient.get<${result_param.typ.typescript()}>('${get_endpoint(path)}');"
// }
// pub enum BaseObjectMethodType {
// new
// get
// set
// delete
// list
// other
// }
// pub struct BaseObjectMethod {
// pub:
// typ BaseObjectMethodType
// object string // the name of the base object
// }

View File

@@ -1,110 +0,0 @@
module generator
import json
import incubaid.herolib.develop.codetools as code { File, Function, Struct, VFile }
import incubaid.herolib.schemas.openrpc { OpenRPC }
import incubaid.herolib.schemas.openrpc.codegen { generate_client_file, generate_client_test_file }
pub fn generate_openrpc_file(spec OpenRPC) !File {
return File{
name: 'openrpc'
extension: 'json'
content: json.encode(spec)
}
}
pub fn generate_openrpc_client_file(spec OpenRPC) !VFile {
mut objects_map := map[string]Struct{}
// for object in spec.objects {
// objects_map[object.structure.name] = object.structure
// }
client_file := generate_client_file(spec, objects_map)!
return VFile{
...client_file
name: 'client_openrpc'
}
}
pub fn generate_openrpc_client_test_file(spec OpenRPC) !VFile {
mut objects_map := map[string]Struct{}
// for object in spec.objects {
// objects_map[object.structure.name] = object.structure
// }
mut methods_map := map[string]Function{}
// for method in spec.methods {
// methods_map[method.func.name] = method.func
// }
file := generate_client_test_file(spec, methods_map, objects_map)!
return VFile{
...file
name: 'client_openrpc_test'
}
}
// pub fn (actor Actor) generate_openrpc_code() !Module {
// openrpc_obj := actor.generate_openrpc()
// openrpc_json := openrpc_obj.encode()!
// openrpc_file := File{
// name: 'openrpc'
// extension: 'json'
// content: openrpc_json
// }
// mut methods_map := map[string]Function{}
// for method in actor.methods {
// methods_map[method.func.name] = method.func
// }
// mut objects_map := map[string]Struct{}
// for object in actor.objects {
// objects_map[object.structure.name] = object.structure
// }
// // actor_struct := generate_actor_struct(actor.name)
// actor_struct := actor.structure
// client_file := openrpc_obj.generate_client_file(objects_map)!
// client_test_file := openrpc_obj.generate_client_test_file(methods_map, objects_map)!
// handler_file := openrpc_obj.generate_handler_file(actor_struct, methods_map, objects_map)!
// handler_test_file := openrpc_obj.generate_handler_test_file(actor_struct, methods_map,
// objects_map)!
// server_file := openrpc_obj.generate_server_file()!
// server_test_file := openrpc_obj.generate_server_test_file()!
// return Module{
// files: [
// client_file,
// client_test_file,
// handler_file,
// handler_test_file,
// server_file,
// server_test_file,
// ]
// // misc_files: [openrpc_file]
// }
// }
// pub fn (mut a Actor) export_playground(path string, openrpc_path string) ! {
// dollar := '$'
// openrpc.export_playground(
// dest: pathlib.get_dir(path: '${path}/playground')!
// specs: [
// pathlib.get(openrpc_path),
// ]
// )!
// mut cli_file := pathlib.get_file(path: '${path}/command/cli.v')!
// cli_file.write($tmpl('./templates/playground.v.template'))!
// }
// pub fn param_to_content_descriptor(param Param) openrpc.ContentDescriptor {
// if param.name == 'id' && param.typ.symbol ==
// return openrpc.ContentDescriptor {
// name: param.name
// summary: param.description
// required: param.is_required()
// schema:
// }
// }

View File

@@ -1,38 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { Function, Param, Result, Struct, Type }
import incubaid.herolib.schemas.openrpc
import incubaid.herolib.baobab.specification { ActorMethod, ActorSpecification, BaseObject }
const test_actor_specification = ActorSpecification{
methods: [
ActorMethod{
func: Function{
name: 'get_object'
params: [
Param{
name: 'id'
typ: Type{
symbol: 'int'
}
},
]
result: Result{
typ: Type{
symbol: 'Object'
}
}
}
},
]
objects: [BaseObject{
structure: Struct{
name: 'Object'
}
}]
}
pub fn test_generate_openrpc() ! {
object := generate_openrpc(test_actor_specification)
assert object.encode()!.len > 0
}

View File

@@ -1,75 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code { File, Folder }
import incubaid.herolib.core.texttools
// generates the folder with runnable scripts of the actor
pub fn generate_scripts_folder(name string, example bool) Folder {
actor_name := '${texttools.snake_case(name)}_actor'
return Folder{
name: 'scripts'
files: [
generate_run_script(actor_name),
generate_docs_script(actor_name),
generate_run_actor_script(name),
generate_run_actor_example_script(name),
generate_run_http_server_script(name),
// generate_compile_script(actor_name),
// generate_generate_script(actor_name)
]
}
}
// Function to generate a script for running an actor
fn generate_run_script(actor_name string) File {
actor_title := texttools.title_case(actor_name)
dollar := '$'
return File{
name: 'run'
extension: 'sh'
content: $tmpl('./templates/run.sh.template')
}
}
// Function to generate a script for running an actor
fn generate_docs_script(actor_name string) File {
dollar := '$'
return File{
name: 'docs'
extension: 'vsh'
content: $tmpl('./templates/docs.vsh.template')
}
}
// Function to generate a script for running an actor
fn generate_run_actor_script(name string) File {
name_snake := texttools.snake_case(name)
name_pascal := texttools.pascal_case(name)
return File{
name: 'run_actor'
extension: 'vsh'
content: $tmpl('./templates/run_actor.vsh.template')
}
}
// Function to generate a script for running an example actor
fn generate_run_actor_example_script(name string) File {
name_snake := texttools.snake_case(name)
name_pascal := texttools.pascal_case(name)
return File{
name: 'run_actor_example'
extension: 'vsh'
content: $tmpl('./templates/run_actor_example.vsh.template')
}
}
// Function to generate a script for running an HTTP server
fn generate_run_http_server_script(name string) File {
port := 8080
name_snake := texttools.snake_case(name)
return File{
name: 'run_http_server'
extension: 'vsh'
content: $tmpl('./templates/run_http_server.vsh.template')
}
}

View File

@@ -1,41 +0,0 @@
import os
import incubaid.herolib.baobab.stage
import incubaid.herolib.core.redisclient
import incubaid.herolib.schemas.openapi
import time
pub const configuration = stage.ActorConfig{
name: '@{name_snake}'
version: '@{version}'
}
@@[heap]
struct @{name_pascal}Actor {
stage.Actor
pub mut:
@{name_snake} I@{name_pascal}
}
pub fn new(core I@{name_pascal}, config stage.ActorConfig) !&@{name_pascal}Actor {
return &@{name_pascal}Actor{
Actor: stage.new_actor(config)!
@{name_snake}: core
}
}
pub fn (mut a @{name_pascal}Actor) handle(method string, data string) !string {
action := a.act(
name: method
params: data
)!
return action.result
}
// Actor listens to the Redis queue for method invocations
pub fn (mut a @{name_pascal}Actor) run() ! {
mut rpc := a.get_redis_rpc()!
for {
rpc.process(a.handle)!
time.sleep(time.millisecond * 100) // Prevent CPU spinning
}
}

View File

@@ -1,37 +0,0 @@
import os
import incubaid.herolib.baobab.stage
import incubaid.herolib.core.redisclient
import incubaid.herolib.schemas.openapi
import time
const name = '@{actor_name_snake}'
@@[heap]
struct @{actor_name_pascal}Actor {
stage.Actor
}
pub fn new() !&@{actor_name_pascal}Actor {
return &@{actor_name_pascal}Actor{
Actor: stage.new_actor('example_@{actor_name_snake}')!
}
}
pub fn (mut a @{actor_name_pascal}Actor) handle(method string, data string) !string {
action := a.act(
name: method
params: data
)!
return action.result
}
// Actor listens to the Redis queue for method invocations
pub fn (mut a @{actor_name_pascal}Actor) run() ! {
mut redis := redisclient.new('localhost:6379')!
mut rpc := redis.rpc_get('actor_@{dollar}{a.name}')
println('Actor started and listening for tasks...')
for {
rpc.process(a.handle)!
time.sleep(time.millisecond * 100) // Prevent CPU spinning
}
}

View File

@@ -1,10 +0,0 @@
const test_port = 8101
pub fn test_new() ! {
new() or { return error('Failed to create actor:\n@{dollar}{err}') }
}
pub fn test_actor_run() ! {
mut actor := new()!
spawn actor.run()
}

View File

@@ -1,63 +0,0 @@
module @{name}
import os
import cli { Command }
import veb
import incubaid.herolib.schemas.openrpc
import incubaid.herolib.core.pathlib
const openrpc_path = '@{dollar}{os.dir(os.dir(@@FILE))}/openrpc.json'
const playground_path = '@{dollar}{os.dir(os.dir(@@FILE))}/playground'
fn do() ! {
mut cmd := new_command()
cmd.setup()
cmd.parse(os.args)
}
pub fn new_command() Command {
mut cmd := Command{
name: '@{name}'
description: 'Your @{name} toolset.'
version: '1.0.16'
}
mut cmd_run := Command{
name: 'run_server'
description: 'Run @{name} websocket server.'
usage: ''
required_args: 0
execute: cmd_run_wsserver
}
mut cmd_playground := Command{
name: 'playground'
description: 'Run @{name} playground server.'
usage: ''
required_args: 0
execute: playground
}
cmd.add_command(cmd_run)
cmd.add_command(cmd_playground)
return cmd
}
fn cmd_run_wsserver(cmd Command) ! {
// accountant.run_wsserver(3000)!
}
fn playground(cmd Command) ! {
pg := openrpc.new_playground(
dest: pathlib.get_dir(path: playground_path)!
specs: [pathlib.get_file(path: openrpc_path)!]
)!
veb.run(pg, 8080)
}
fn main() {
do() or { panic(err) }
}

View File

@@ -1,74 +0,0 @@
module pet_store_actor
import incubaid.herolib.baobab.stage
import incubaid.herolib.core.redisclient
import x.json2 as json
import time
fn mock_response() ! {
mut redis := redisclient.new('localhost:6379')!
mut rpc_q := redis.rpc_get('actor_pet_store')
for {
rpc_q.process(fn (method string, data string) !string {
return json.encode(method)
})!
time.sleep(time.millisecond * 100) // Prevent CPU spinning
}
}
fn test_list_pets() ! {
mut client := new_client()!
limit := 10
spawn mock_response()
pets := client.list_pets(limit)!
assert pets.len <= limit
println('test_list_pets passed')
}
fn test_create_pet() ! {
mut client := new_client()!
client.create_pet()!
println('test_create_pet passed')
}
fn test_get_pet() ! {
mut client := new_client()!
pet_id := 1 // Replace with an actual pet ID in your system
pet := client.get_pet(pet_id)!
assert pet.id == pet_id
println('test_get_pet passed')
}
fn test_delete_pet() ! {
mut client := new_client()!
pet_id := 1 // Replace with an actual pet ID in your system
client.delete_pet(pet_id)!
println('test_delete_pet passed')
}
fn test_list_orders() ! {
mut client := new_client()!
client.list_orders()!
println('test_list_orders passed')
}
fn test_get_order() ! {
mut client := new_client()!
order_id := 1 // Replace with an actual order ID in your system
order := client.get_order(order_id)!
assert order.id == order_id
println('test_get_order passed')
}
fn test_delete_order() ! {
mut client := new_client()!
order_id := 1 // Replace with an actual order ID in your system
client.delete_order(order_id)!
println('test_delete_order passed')
}
fn test_create_user() ! {
mut client := new_client()!
client.create_user()!
println('test_create_user passed')
}

View File

@@ -1,81 +0,0 @@
import incubaid.herolib.core.pathlib
import cli { Command, Flag }
import os
import incubaid.herolib.ui.console
pub fn cmd_example_actor() Command {
mut cmd := Command{
name: 'example_actor'
usage: ''
description: 'create, edit, show mdbooks'
required_args: 0
execute: cmd_example_actor_execute
}
mut cmd_list := Command{
sort_flags: true
name: 'list_books'
execute: cmd_publisher_list_books
description: 'will list existing mdbooks'
pre_execute: pre_func
}
mut cmd_open := Command{
name: 'open'
execute: cmd_publisher_open
description: 'will open the publication with the provided name'
pre_execute: pre_func
}
cmd_open.add_flag(Flag{
flag: .string
name: 'name'
abbrev: 'n'
description: 'name of the publication.'
})
cmd.add_command(cmd_list)
cmd.add_command(cmd_open)
return cmd
}
fn cmd_publisher_list_books(cmd Command) ! {
console.print_header('Books:')
books := publisher.list_books()!
for book in books {
console.print_stdout(book.str())
}
}
fn cmd_publisher_open(cmd Command) ! {
name := cmd.flags.get_string('name') or { '' }
publisher.open(name)!
}
fn cmd_execute(cmd Command) ! {
mut name := cmd.flags.get_string('name') or { '' }
if name == '' {
console.print_debug('did not find name of book to generate, check in heroscript or specify with --name')
publisher_help(cmd)
exit(1)
}
edit := cmd.flags.get_bool('edit') or { false }
open := cmd.flags.get_bool('open') or { false }
if edit || open {
// mdbook.book_open(name)!
}
if edit {
// publisher.book_edit(name)!
}
}
fn publisher_help(cmd Command) {
console.clear()
console.print_header('Instructions for example actor:')
console.print_lf(1)
console.print_stdout(cmd.help_message())
console.print_lf(5)
}

View File

@@ -1,51 +0,0 @@
#!/usr/bin/env -S v -n -w -gc none -no-retry-compilation -cc tcc -d use_openssl -enable-globals run
import os
abs_dir_of_script := dir(@@FILE)
// Format code
println('Formatting code...')
if os.system('v fmt -w @{dollar}{abs_dir_of_script}/examples') != 0 {
eprintln('Warning: Failed to format examples')
}
if os.system('v fmt -w @{dollar}{abs_dir_of_script}/src') != 0 {
eprintln('Warning: Failed to format actor')
}
// Clean existing docs
println('Cleaning existing documentation...')
os.rmdir_all('_docs') or {}
os.rmdir_all('docs') or {}
os.rmdir_all('vdocs') or {}
herolib_path := os.join_path(abs_dir_of_script, 'lib')
os.chdir(herolib_path) or {
panic('Failed to change directory to herolib: @{dollar}{err}')
}
os.rmdir_all('_docs') or {}
os.rmdir_all('docs') or {}
os.rmdir_all('vdocs') or {}
// Generate HTML documentation
println('Generating HTML documentation...')
if os.system('v doc -m -f html . -readme -comments -no-timestamp -o ../docs') != 0 {
panic('Failed to generate HTML documentation')
}
os.chdir(abs_dir_of_script) or {
panic('Failed to change directory to abs_dir_of_script: @{dollar}{err}')
}
// Generate Markdown documentation
println('Generating Markdown documentation...')
os.rmdir_all('vdocs') or {}
if os.system('v doc -m -no-color -f md -o vdocs/herolib/') != 0 {
panic('Failed to generate Hero markdown documentation')
}
println('Documentation generation completed successfully!')

View File

@@ -1,41 +0,0 @@
import incubaid.herolib.schemas.openapi { OpenAPI }
import incubaid.herolib.baobab.stage { Client, ClientConfig }
import incubaid.herolib.schemas.openrpc { OpenRPC }
import incubaid.herolib.baobab.stage.interfaces { HTTPServer, Context }
import veb
@@[params]
pub struct HTTPServerParams {
pub:
base_url string
port int = 8080
}
pub fn new_http_server(params HTTPServerParams) !&HTTPServer {
mut s := interfaces.new_http_server()!
@if openrpc
mut openrpc_controller := new_openrpc_http_controller(HTTPServerParams{
...params
base_url: '@{dollar}{params.base_url}/openrpc'
})!
s.register_controller[openrpc.HTTPController, Context]('/openrpc', mut openrpc_controller)!
@end
@if openapi
mut openapi_ctrl := new_openapi_http_controller(configuration, params)!
mut openapi_ex_ctrl := new_openapi_http_controller(configuration.example(), params)!
mut openapi_playground_controller := openapi.new_playground_controller(
base_url: '@{dollar}{params.base_url}/playground/openapi'
specification_path: openapi_spec_path
)!
s.register_controller[openapi.HTTPController, Context]('/openapi/v1', mut openapi_ctrl)!
s.register_controller[openapi.HTTPController, Context]('/openapi/example', mut openapi_ex_ctrl)!
s.register_controller[openapi.PlaygroundController, Context]('/playground/openapi', mut openapi_playground_controller)!
@end
return s
}
pub fn run_http_server(params HTTPServerParams) ! {
mut server := new_http_server(params)!
veb.run[HTTPServer, Context](mut server, params.port)
}

View File

@@ -1,7 +0,0 @@
fn test_new_http_server() ! {
new_http_server()!
}
fn test_run_http_server() ! {
spawn run_http_server()
}

View File

@@ -1,22 +0,0 @@
import incubaid.herolib.baobab.stage.interfaces
import incubaid.herolib.baobab.stage
import incubaid.herolib.schemas.openapi
pub fn new_openapi_interface(config stage.ActorConfig) !&interfaces.OpenAPIInterface {
// create OpenAPI Handler with actor's client
client := new_client(config)!
return interfaces.new_openapi_interface(client.Client)
}
@if http
// creates HTTP controller with the actor's OpenAPI Handler
// and OpenAPI Specification
pub fn new_openapi_http_controller(config stage.ActorConfig, params HTTPServerParams) !&openapi.HTTPController {
return openapi.new_http_controller(
base_url: '@{dollar}{params.base_url}/openapi/@{dollar}{config.version}'
specification: openapi_specification
specification_path: openapi_spec_path
handler: new_openapi_interface(config)!
)
}
@end

View File

@@ -1,9 +0,0 @@
fn test_new_openapi_interface() ! {
new_openapi_interface()!
}
@if http
fn test_new_openapi_http_controller() ! {
new_openapi_http_controller()!
}
@end

View File

@@ -1,19 +0,0 @@
import incubaid.herolib.baobab.stage.interfaces
import incubaid.herolib.schemas.openrpc
pub fn new_openrpc_interface() !&interfaces.OpenRPCInterface {
// create OpenRPC Handler with actor's client
client := new_client()!
return interfaces.new_openrpc_interface(client.Client)
}
@if http
// creates HTTP controller with the actor's OpenRPC Handler
// and OpenRPC Specification
pub fn new_openrpc_http_controller(params ServerParams) !&openrpc.HTTPController {
return openrpc.new_http_controller(
specification: openrpc_specification
handler: new_openrpc_interface()!
)
}
@end

View File

@@ -1,9 +0,0 @@
fn test_new_openrpc_interface() ! {
new_openrpc_interface()!
}
@if http
fn test_new_openrpc_http_controller() ! {
new_openrpc_http_controller()!
}
@end

View File

@@ -1,16 +0,0 @@
#!/usr/bin/env -S v -n -cg -w -enable-globals run
import incubaid.herolib.baobab.stages.accountant
import veb
import incubaid.herolib.schemas.openrpc
import os
import incubaid.herolib.core.pathlib
const openrpc_path = '@{dollar}{os.dir(os.dir(@@FILE))}/openrpc.json'
const playground_path = '@{dollar}{os.dir(os.dir(@@FILE))}/playground'
pg := openrpc.new_playground(
dest: pathlib.get_dir(path: playground_path)!
specs: [pathlib.get_file(path:openrpc_path)!]
)!
veb.run(pg, 8080)

View File

@@ -1,42 +0,0 @@
#!/bin/bash -ex
DIR="@{dollar}(cd "@{dollar}(dirname "@{dollar}{BASH_SOURCE[0]}")" && pwd)"
echo "@{dollar}DIR"
chmod +x @{dollar}{DIR}/run_actor.vsh
@{dollar}{DIR}/run_actor.vsh &
ACTOR_PID=@{dollar}!
chmod +x @{dollar}{DIR}/run_actor_example.vsh
@{dollar}{DIR}/run_actor_example.vsh &
EXAMPLE_ACTOR_PID=@{dollar}!
chmod +x @{dollar}{DIR}/run_http_server.vsh
@{dollar}{DIR}/run_http_server.vsh &
HTTP_SERVER_PID=@{dollar}!
# Print desired output
echo "${actor_title} Actor Redis Interface running on redis://localhost:6379"
echo "* /queues/${actor_name} -> Action Interface"
echo ""
echo "${actor_title} Actor HTTP Server running on http://localhost:8080"
echo "* http://localhost:8080/playground/openapi -> OpenAPI Playground"
echo "* http://localhost:8080/openapi -> OpenAPI Interface"
# echo "* http://localhost:8080/docs -> Documentation"
echo ""
# Function to clean up when script is killed
cleanup() {
echo "Stopping background processes..."
kill "@{dollar}ACTOR_PID" "@{dollar}EXAMPLE_ACTOR_PID" "@{dollar}HTTP_SERVER_PID" 2>/dev/null
wait
echo "All processes stopped."
exit 0
}
# Trap SIGINT (Ctrl+C), SIGTERM, and SIGQUIT to call cleanup
trap cleanup SIGINT SIGTERM SIGQUIT
# Wait for processes to finish
wait

View File

@@ -1,10 +0,0 @@
#!/usr/bin/env -S v -w -n -enable-globals run
import @{name_snake}
mut actor := @{name_snake}.new(
@{name_snake}.new_@{name_snake}()!,
@{name_snake}.configuration
)!
actor.run()!

View File

@@ -1,10 +0,0 @@
#!/usr/bin/env -S v -w -n -enable-globals run
import @{name_snake}
mut actor := @{name_snake}.new(
@{name_snake}.new_@{name_snake}_example()!,
@{name_snake}.configuration.example()
)!
actor.run()!

View File

@@ -1,7 +0,0 @@
#!/usr/bin/env -S v -w -n -enable-globals run
import @{name_snake}
@{name_snake}.run_http_server(
base_url: 'http://localhost:@{port}'
port: @{port}
)!

View File

@@ -1,14 +0,0 @@
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.openrpc
import os
@if support_openrpc
const openrpc_spec_path = '@{dollar}{os.dir(@@FILE)}/docs/specs/openrpc.json'
const openrpc_spec_json = os.read_file(openrpc_spec_path) or { panic(err) }
const openrpc_specification = openrpc.decode(openrpc_spec_json)!
@end
@if support_openapi
const openapi_spec_path = '@{dollar}{os.dir(os.dir(@@FILE))}/docs/specs/openapi.json'
const openapi_spec_json = os.read_file(openapi_spec_path) or { panic(err) }
const openapi_specification = openapi.json_decode(openapi_spec_json)!
@end

View File

@@ -1 +0,0 @@
pet_store_actor

View File

@@ -1,139 +0,0 @@
module generator
import incubaid.herolib.develop.codetools as code
import os
// // generate_object_methods generates CRUD actor methods for a provided structure
// pub fn (generator ActorGenerator) generate_object_methods(structure code.Struct) []code.Function {
// return [
// generator.generate_get_method(structure),
// // generator.generate_set_method(structure),
// // generator.generate_delete_method(structure),
// // generator.generate_get_method(structure),
// ]
// }
// tests that generate_get_method produces a getter method for a provided structure
pub fn test_generate_get_method() {
generator := ActorGenerator{'test'}
actor_struct := code.Struct{
name: 'TestActor'
fields: [
code.StructField{
name: 'test_struct_map'
typ: code.Type{
symbol: 'map[string]&TestStruct'
}
},
]
}
test_struct := code.Struct{
name: 'TestStruct'
}
field := get_child_field(
parent: actor_struct
child: test_struct
)
method := generator.generate_get_method(
actor_name: actor_struct.name
actor_field: field
root_struct: test_struct
)
}
// // generate_object_methods generates CRUD actor methods for a provided structure
// pub fn (generator ActorGenerator) generate_set_method(structure code.Struct) code.Function {
// params_getter := "id := params.get('id')!"
// field := generator.get_object_field(structure)
// object_getter := 'object := actor.${field.name}[id]'
// body := '${params_getter}\n${object_getter}\nreturn object'
// get_method := code.Function{
// name: 'get_${generator.model_name}'
// description: 'gets the ${structure.name} with the given object id'
// receiver: code.Param{
// name: 'actor'
// struct_: generator.actor_struct
// }
// params: [
// code.Param{
// name: 'id'
// typ: code.Type{
// symbol: 'string'
// }
// },
// ]
// result: code.Result{
// structure: structure
// }
// body: body
// }
// return get_method
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// pub fn (generator ActorGenerator) generate_get_method(structure code.Struct) code.Function {
// params_getter := "id := params.get('id')!"
// field := generator.get_object_field(structure)
// object_getter := 'object := actor.${field.name}[id]'
// body := '${params_getter}\n${object_getter}\nreturn object'
// get_method := code.Function{
// name: 'get_${generator.model_name}'
// description: 'gets the ${structure.name} with the given object id'
// receiver: code.Param{
// name: 'actor'
// struct_: generator.actor_struct
// }
// params: [
// code.Param{
// name: 'id'
// typ: code.Type{
// symbol: 'string'
// }
// },
// ]
// result: code.Result{
// structure: structure
// }
// body: body
// }
// return get_method
// }
// // generate_object_methods generates CRUD actor methods for a provided structure
// pub fn (generator ActorGenerator) generate_delete_method(structure code.Struct) code.Function {
// params_getter := "id := params.get('id')!"
// field := generator.get_object_field(structure)
// object_getter := 'object := actor.${field.name}[id]'
// body := '${params_getter}\n${object_getter}\nreturn object'
// get_method := code.Function{
// name: 'get_${generator.model_name}'
// description: 'gets the ${structure.name} with the given object id'
// receiver: code.Param{
// name: 'actor'
// struct_: generator.actor_struct
// }
// params: [
// code.Param{
// name: 'id'
// typ: code.Type{
// symbol: 'string'
// }
// },
// ]
// result: code.Result{
// structure: structure
// }
// body: body
// }
// return get_method
// }
// pub fn (generator ActorGenerator) get_object_field(structure code.Struct) code.StructField {
// fields := generator.actor_struct.fields.filter(it.typ.symbol == 'map[string]&${structure.name}')
// if fields.len != 1 {
// panic('this should never happen')
// }
// return fields[0]
// }

View File

@@ -1,33 +0,0 @@
# OSIS
Object Storage and Indexing System
A system for storing root objects efficiently, indexed by selected fields.
OSIS comprises two elements:
- Indexer: responsible for indexing and identifying objects
- Storer: responsible for storing objects in different databases, with varying encodings and encryption.
## Indexer
The indexer's primary duty is to create and query SQL tables for a given base object specification and its indices. For instance: I specify a base object called Pet, and I specify Pet so that (more on writing specifications here) its `breed` field is indexable.
```
struct Pet {
breed string @[index]
}
```
Given this specification, the indexer is expected to create an SQL table with the `breed` field as a column. This allows the backend to filter and search base objects by their fields. Note that the object isn't stored in the table, just its id. Object storage and modification are handled by the Storer.
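As a sketch of how the `@[index]` attribute is consumed (the `get_indices` helper exists in this module's indexer code; the table layout shown in the comments is an assumption for illustration, not taken from the implementation):

```v
// The indexer derives index columns from fields tagged @[index]
// via compile-time reflection.
struct Pet {
	breed string @[index]
}

pet := Pet{
	breed: 'labrador'
}
indices := get_indices[Pet](pet) // {'breed': 'labrador'}
// The indexer would then create and fill a table roughly like:
//   CREATE TABLE Pet (id INTEGER, breed TEXT);
// storing only the id and index columns; the object itself lives in the Storer.
```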
## Getting started
## Generic Code
The solution provided by this module is to create a backend interface with generic CRUD + list + filter methods for root objects that different backends can implement.
This allows a single generated actor codebase to use different backends, without having to generate separate code for each. Less generated code is less prone to errors, and using the same backend methods for each actor makes it easier to modify, fix, and add features to the backends. Using the same data manipulation methods in generated code also simplifies code generation for the actor, as the implementations don't differ between root objects.
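A minimal sketch of what such a backend interface could look like (the interface name and exact signatures are assumptions for illustration; the module's actual entry points are the generic methods on `OSIS`):

```v
// Hypothetical backend interface, parameterized over the root object type:
// any backend implementing these generic CRUD + list + filter methods can
// serve the same generated actor code.
interface Backend[T] {
mut:
	new(obj T) !u32 // store a new root object, return its id
	get(id u32) !T // retrieve a root object by id
	set(obj T) ! // update an existing root object
	delete(id u32) ! // remove a root object
	list() ![]T // list all root objects of type T
	filter(f T, params FilterParams) ![]T // filter by indexed fields
}
```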
### Creating a backend

View File

@@ -1,8 +0,0 @@
module osis
pub fn new(config OSISConfig) !OSIS {
return OSIS{
indexer: new_indexer()!
storer: new_storer()!
}
}

View File

@@ -1,116 +0,0 @@
module osis
import json
import db.sqlite
import incubaid.herolib.core.texttools
import incubaid.herolib.core.pathlib
pub struct Indexer {
db sqlite.DB
}
@[params]
pub struct IndexerConfig {
db_path string
reset bool
}
pub fn new_indexer(config IndexerConfig) !Indexer {
return Indexer{}
}
// reset deletes the indexer database file at the given path
pub fn reset(path string) ! {
mut db_file := pathlib.get_file(path: path)!
db_file.delete()!
}
pub fn (mut i Indexer) new_generic[T](id u32, object T) !u32 {
return i.new(get_table[T](), id, get_indices[T](object))!
}
// new creates a new root object entry in the root_objects table,
// and the table belonging to the type of root object with columns for index fields
pub fn (mut i Indexer) new(table string, id u32, indices map[string]string) !u32 {
insert_query := 'INSERT into ${table} (${indices.keys().join(',')}) values (${indices.values().join(',')})'
i.db.exec(insert_query) or {
return error('Error inserting object ${id} into table ${table}\n${err}')
}
return 0
}
// set updates the index entries for a root object
pub fn (mut backend Indexer) set(obj RootObject) ! {
panic('implement')
}
// delete removes the index entries for a root object
pub fn (mut backend Indexer) delete(id string, obj RootObject) ! {
panic('implement')
}
pub fn (mut backend Indexer) get(id string, obj RootObject) !RootObject {
panic('implement')
}
pub fn (mut backend Indexer) get_json(id string, obj RootObject) !string {
panic('implement')
}
pub fn (mut backend Indexer) list(obj RootObject) ![]u32 {
panic('implement')
}
// from and to for int f64 time etc.
@[params]
pub struct FilterParams {
// indices map[string]string // map of index values that are being filtered by, in order of priority.
limit int // limit to the number of values to be returned, in order of priority
fuzzy bool // if fuzzy matching is enabled in matching indices
matches_all bool // if results should match all indices or any
}
// filter lists root objects of type T that match provided index parameters and params.
pub fn (mut backend Indexer) filter(filter RootObject, params FilterParams) ![]string {
panic('implement')
}
// create_root_struct_table creates a table for a root_struct with columns for each index field
fn (mut backend Indexer) create_root_object_table(object RootObject) ! {
panic('implement')
}
// deletes an indexer table belonging to a root object
fn (mut backend Indexer) delete_table(object RootObject) ! {
panic('implement')
}
fn (mut backend Indexer) get_table_indices(table_name string) ![]string {
panic('implement')
}
fn (mut backend Indexer) table_exists(table_name string) !bool {
panic('implement')
}
// get_table_name returns the name of the table belonging to a root struct
fn get_table_name(object RootObject) string {
panic('implement')
}
// get_table_name returns the name of the table belonging to a root struct
fn get_table[T]() string {
return typeof[T]()
}
// returns the lists of the indices of a root objects db table, and corresponding values
pub fn get_indices[T](object T) map[string]string {
mut indices := map[string]string{}
$for field in T.fields {
if field.attrs.contains('index') {
value := object.$(field.name)
indices[field.name] = '${value}'
}
}
return indices
}

View File

@@ -1,16 +0,0 @@
module osis
pub struct OSIS {
pub mut:
indexer Indexer // storing indices
storer Storer
}
@[params]
pub struct OSISConfig {
pub:
directory string
name string
secret string
reset bool
}

View File

@@ -1,58 +0,0 @@
module osis
import os
pub fn (mut o OSIS) generic_new[T](obj T) !u32 {
id := o.indexer.generic_new[T](obj)!
o.storer.generic_new[T](obj)!
return id
}
pub fn (mut o OSIS) new[T](obj T) !u32 {
id := o.storer.new_generic[T](obj)!
o.indexer.new_generic[T](id, obj)!
return id
}
pub fn (mut o OSIS) generic_get[T](id u32) !T {
return o.storer.generic_get[T](id)!
}
pub fn (mut o OSIS) get[T](id u32) !T {
return o.storer.generic_get[T](u32(id))!
}
pub fn (mut o OSIS) generic_set[T](obj T) ! {
o.indexer.generic_set[T](obj) or { return error('Failed to set new indices:\n${err}') }
o.storer.generic_set[T](obj)!
}
pub fn (mut o OSIS) generic_delete[T](id u32) ! {
o.indexer.generic_delete[T](id)!
o.storer.generic_delete[T](id)!
}
pub fn (mut o OSIS) delete(id u32) ! {
o.storer.delete(u32(id))!
}
pub fn (mut o OSIS) list[T]() ![]T {
panic('implement')
// ids := o.indexer.generic_list[T]()!
// return o.storer.generic_list[T](ids)!
}
pub fn (mut o OSIS) generic_list[T]() ![]T {
ids := o.indexer.generic_list[T]()!
return o.storer.generic_list[T](ids)!
}
pub fn (mut o OSIS) generic_filter[T, D](filter D, params FilterParams) ![]T {
ids := o.indexer.generic_filter[T, D](filter, params)!
return o.storer.generic_list[T](ids)!
}
pub fn (mut o OSIS) generic_reset[T]() ! {
o.indexer.generic_reset[T]()!
o.storer.generic_reset[T]()!
}

View File

@@ -1,135 +0,0 @@
module osis
import x.json2
// describes a root object
pub struct RootObject {
pub mut:
id string
name string // Story
fields []FieldDescription
}
pub struct FieldDescription {
pub mut:
name string // name of field
typ FieldType
value string // value of field
is_secret bool // whether field should be encrypted upon storage
is_index bool // whether object is searchable by field
fts_enabled bool // whether full text search on field is enabled
}
// returns the sql type name of the field
pub fn (field FieldDescription) sql_type() string {
return match field.typ {
.text { 'TEXT' }
.number { 'INTEGER' }
}
}
pub enum FieldType {
number
text
}
pub fn (obj RootObject) to_json() string {
mut obj_map := map[string]json2.Any{}
for field in obj.fields {
obj_map[field.name] = field.value
}
return obj_map.str()
}
// returns the lists of the indices of a root objects db table, and corresponding values
pub fn (obj RootObject) sql_indices_values() ([]string, []string) {
obj_encoded := obj.to_json()
obj_val := "'${obj_encoded.replace("'", "''")}'"
// insert root object into its table
mut indices := ['data']
mut values := [obj_val]
for field in obj.fields {
if field.name == 'id' {
indices << '${field.name}'
values << '${field.value}'
}
if field.typ == .text {
if field.is_index {
indices << '${field.name}'
values << "'${field.value}'"
}
} else if field.typ == .number {
if field.is_index {
indices << '${field.name}'
values << '${field.value}'
}
}
}
println('debugzoni ${indices} ${values}')
return indices, values
}
// return the description of a given generic
pub fn root_object[T](object T) RootObject {
mut fields := []FieldDescription{}
$for field in T.fields {
mut typ := FieldType{}
$if field.typ is string {
typ = .text
} $else $if field.typ is int {
typ = .number
}
fields << FieldDescription{
name: field.name
typ: typ
value: object.$(field.name).str()
is_index: field.attrs.contains('index')
is_secret: field.attrs.contains('secret')
fts_enabled: field.attrs.contains('fts_enabled')
}
}
return RootObject{
name: typeof[T]()
fields: fields
}
}
// decodes root object into generic struct T
pub fn (object RootObject) to_generic[T]() T {
mut t := T{}
$for field in T.fields {
field_descrs := object.fields.filter(it.name == field.name)
if field_descrs.len == 1 {
$if field.typ is int {
t.$(field.name) = field_descrs[0].value.int()
} $else $if field.is_enum {
t.$(field.name) = field_descrs[0].value.int()
} $else {
t.$(field.name) = field_descrs[0].value
}
}
}
return t
}
pub fn root_object_from_json(json string) !RootObject {
raw_decode := json2.raw_decode(json)!
obj_map := raw_decode.as_map()
mut obj := RootObject{}
for key, val in obj_map {
obj.fields << FieldDescription{
name: key
value: val.str()
}
}
return obj
}

View File

@@ -1,15 +0,0 @@
module osis
import incubaid.herolib.data.ourdb { OurDB }
import os
pub struct Storer {
pub mut:
db OurDB
}
pub fn new_storer() !Storer {
return Storer{
db: ourdb.new()!
}
}

View File

@@ -1,23 +0,0 @@
module osis
import json
// new_generic stores the JSON-encoded root object and returns its id
pub fn (mut storer Storer) new_generic[T](obj T) !u32 {
data := json.encode(obj).bytes()
return storer.db.set(data: data)
}
pub fn (mut storer Storer) generic_get[T](id u32) !T {
return json.decode(T, storer.db.get(id)!.bytestr())
}
pub fn (mut storer Storer) generic_set[T](obj T) ! {
data := json.encode(obj).bytes()
return storer.db.set(data: data)
}
pub fn (mut storer Storer) delete(id u32) ! {
storer.db.delete(id)!
}

View File

@@ -1,180 +0,0 @@
module specification
import incubaid.herolib.core.texttools
import incubaid.herolib.develop.codetools as code { Struct }
import incubaid.herolib.schemas.jsonschema { Schema, SchemaRef }
import incubaid.herolib.schemas.openapi { MediaType, OpenAPI, OperationInfo, Parameter }
import incubaid.herolib.schemas.openrpc { ContentDescriptor, ErrorSpec, Example, ExamplePairing, ExampleRef }
// Helper function: Convert OpenAPI parameter to ContentDescriptor
fn openapi_param_to_content_descriptor(param Parameter) ContentDescriptor {
return ContentDescriptor{
name: param.name
summary: param.description
description: param.description
required: param.required
schema: param.schema
}
}
// Helper function: Convert OpenAPI parameter to Example
fn openapi_param_to_example(param Parameter) ?Example {
if param.schema is Schema {
if param.schema.example.str() != '' {
return Example{
name: 'Example ${param.name}'
description: 'Example ${param.description}'
value: param.schema.example
}
}
}
return none
}
// Helper function: Convert OpenAPI operation to ActorMethod
fn openapi_operation_to_actor_method(info OperationInfo) ActorMethod {
mut parameters := []ContentDescriptor{}
mut example_parameters := []Example{}
for param in info.operation.parameters {
parameters << openapi_param_to_content_descriptor(param)
example_parameters << openapi_param_to_example(param) or { continue }
}
if schema_ := info.operation.payload_schema() {
// TODO: document assumption
schema := Schema{
...schema_
title: texttools.pascal_case(info.operation.operation_id)
}
parameters << ContentDescriptor{
name: 'data'
schema: SchemaRef(schema)
}
}
mut success_responses := map[string]MediaType{}
for code, response in info.operation.responses {
if code.starts_with('2') { // Matches all 2xx responses
success_responses[code] = response.content['application/json']
}
}
if success_responses.len > 1 || success_responses.len == 0 {
panic('Actor specification must specify one successful response.')
}
response_success := success_responses.values()[0]
mut result := ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
schema: response_success.schema
}
example_result := if response_success.example.str() != '' {
Example{
name: 'Example response'
value: response_success.example
}
} else {
Example{}
}
pairing := if example_result != Example{} || example_parameters.len > 0 {
ExamplePairing{
params: example_parameters.map(ExampleRef(it))
result: ExampleRef(example_result)
}
} else {
ExamplePairing{}
}
mut errors := []ErrorSpec{}
for status, response in info.operation.responses {
if status.int() >= 400 {
error_schema := if response.content.len > 0 {
response.content.values()[0].schema
} else {
Schema{}
}
errors << ErrorSpec{
code: status.int()
message: response.description
data: error_schema // Extend if error schema is defined
}
}
}
return ActorMethod{
name: info.operation.operation_id
description: info.operation.description
summary: info.operation.summary
parameters: parameters
example: pairing
result: result
errors: errors
}
}
// Helper function: Convert OpenAPI schema to Struct
fn openapi_schema_to_struct(name string, schema SchemaRef) Struct {
// Assuming schema properties can be mapped to Struct fields
return Struct{
name: name
}
}
// Converts OpenAPI to ActorSpecification
pub fn from_openapi(spec_raw OpenAPI) !ActorSpecification {
spec := openapi.process(spec_raw)!
mut objects := []BaseObject{}
// get all operations for path as list of tuple [](path_string, http.Method, openapi.Operation)
// Extract methods from OpenAPI paths
// for path, item in spec.paths {
// if item.get.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.get, item.get.operation_id, path)
// }
// if item.post.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.post, item.post.operation_id, path)
// }
// if item.put.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.put, item.put.operation_id, path)
// }
// if item.delete.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.delete, item.delete.operation_id, path)
// }
// if item.patch.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.patch, item.patch.operation_id, path)
// }
// if item.head.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.head, item.head.operation_id, path)
// }
// if item.options.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.options, item.options.operation_id, path)
// }
// if item.trace.operation_id != '' {
// methods << openapi_operation_to_actor_method(item.trace, item.trace.operation_id, path)
// }
// }
// Extract objects from OpenAPI components.schemas
for name, schema in spec.components.schemas {
objects << BaseObject{
schema: schema as Schema
}
}
return ActorSpecification{
openapi: spec_raw
name: spec.info.title
description: spec.info.description
structure: Struct{} // Assuming no top-level structure for this use case
interfaces: [.openapi] // Default to OpenAPI for input
methods: spec.get_operations().map(openapi_operation_to_actor_method(it))
objects: objects
}
}

View File

@@ -1,400 +0,0 @@
module specification
import x.json2 as json
import incubaid.herolib.develop.codetools as code { Struct }
import incubaid.herolib.schemas.openrpc { ContentDescriptor, ErrorSpec }
import incubaid.herolib.schemas.openapi { Components, Info, OpenAPI, Operation, PathItem, ServerSpec }
import incubaid.herolib.schemas.jsonschema { Reference, Schema, SchemaRef }
const openapi_spec = OpenAPI{
openapi: '3.0.3'
info: Info{
title: 'Pet Store API'
description: 'A sample API for a pet store'
version: '1.0.0'
}
servers: [
ServerSpec{
url: 'https://api.petstore.example.com/v1'
description: 'Production server'
},
ServerSpec{
url: 'https://staging.petstore.example.com/v1'
description: 'Staging server'
},
]
paths: {
'/pets': PathItem{
get: Operation{
summary: 'List all pets'
operation_id: 'listPets'
parameters: [
openapi.Parameter{
name: 'limit'
in_: 'query'
description: 'Maximum number of pets to return'
required: false
schema: Schema{
typ: 'integer'
format: 'int32'
example: 10
}
},
]
responses: {
'200': openapi.ResponseSpec{
description: 'A paginated list of pets'
content: {
'application/json': openapi.MediaType{
schema: Reference{
ref: '#/components/schemas/Pets'
}
example: json.raw_decode('[
{ "id": 1, "name": "Fluffy", "tag": "dog" },
{ "id": 2, "name": "Whiskers", "tag": "cat" }
]')!
}
}
}
'400': openapi.ResponseSpec{
description: 'Invalid request'
}
}
}
post: Operation{
summary: 'Create a new pet'
operation_id: 'createPet'
request_body: openapi.RequestBody{
required: true
content: {
'application/json': openapi.MediaType{
schema: Reference{
ref: '#/components/schemas/NewPet'
}
example: json.raw_decode('{ "name": "Bella", "tag": "dog" }')!
}
}
}
responses: {
'201': openapi.ResponseSpec{
description: 'Pet created'
content: {
'application/json': openapi.MediaType{
schema: Reference{
ref: '#/components/schemas/Pet'
}
example: json.raw_decode('{ "id": 3, "name": "Bella", "tag": "dog" }')!
}
}
}
'400': openapi.ResponseSpec{
description: 'Invalid input'
}
}
}
}
'/pets/{petId}': PathItem{
get: Operation{
summary: 'Get a pet by ID'
operation_id: 'getPet'
parameters: [
openapi.Parameter{
name: 'petId'
in_: 'path'
description: 'ID of the pet to retrieve'
required: true
schema: Schema{
typ: 'integer'
format: 'int64'
example: 1
}
},
]
responses: {
'200': openapi.ResponseSpec{
description: 'A pet'
content: {
'application/json': openapi.MediaType{
schema: Reference{
ref: '#/components/schemas/Pet'
}
example: json.raw_decode('{ "id": 1, "name": "Fluffy", "tag": "dog" }')!
}
}
}
'404': openapi.ResponseSpec{
description: 'Pet not found'
}
}
}
delete: Operation{
summary: 'Delete a pet by ID'
operation_id: 'deletePet'
parameters: [
openapi.Parameter{
name: 'petId'
in_: 'path'
description: 'ID of the pet to delete'
required: true
schema: Schema{
typ: 'integer'
format: 'int64'
example: 1
}
},
]
responses: {
'204': openapi.ResponseSpec{
description: 'Pet deleted'
}
'404': openapi.ResponseSpec{
description: 'Pet not found'
}
}
}
}
}
components: Components{
schemas: {
'Pet': SchemaRef(Schema{
typ: 'object'
required: ['id', 'name']
properties: {
'id': SchemaRef(Schema{
typ: 'integer'
format: 'int64'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
})
'NewPet': SchemaRef(Schema{
typ: 'object'
required: ['name']
properties: {
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
})
'Pets': SchemaRef(Schema{
typ: 'array'
items: SchemaRef(Reference{
ref: '#/components/schemas/Pet'
})
})
}
}
}
const actor_spec = ActorSpecification{
name: 'Pet Store API'
description: 'A sample API for a pet store'
structure: Struct{}
interfaces: [.openapi]
methods: [
ActorMethod{
name: 'listPets'
summary: 'List all pets'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example limit'
description: 'Example Maximum number of pets to return'
value: 10
}),
]
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: json.raw_decode('[
{"id": 1, "name": "Fluffy", "tag": "dog"},
{"id": 2, "name": "Whiskers", "tag": "cat"}
]')!
})
}
parameters: [
ContentDescriptor{
name: 'limit'
summary: 'Maximum number of pets to return'
description: 'Maximum number of pets to return'
required: false
schema: SchemaRef(Schema{
typ: 'integer'
format: 'int32'
example: 10
})
},
]
result: ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
schema: SchemaRef(Reference{
ref: '#/components/schemas/Pets'
})
}
errors: [
ErrorSpec{
code: 400
message: 'Invalid request'
},
]
},
ActorMethod{
name: 'createPet'
summary: 'Create a new pet'
example: openrpc.ExamplePairing{
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: '[]'
})
}
result: ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
}
errors: [
ErrorSpec{
code: 400
message: 'Invalid input'
},
]
},
ActorMethod{
name: 'getPet'
summary: 'Get a pet by ID'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example petId'
description: 'Example ID of the pet to retrieve'
value: 1
}),
]
result: openrpc.ExampleRef(openrpc.Example{
name: 'Example response'
value: json.raw_decode('{"id": 1, "name": "Fluffy", "tag": "dog"}')!
})
}
parameters: [
ContentDescriptor{
name: 'petId'
summary: 'ID of the pet to retrieve'
description: 'ID of the pet to retrieve'
required: true
schema: SchemaRef(Schema{
typ: 'integer'
format: 'int64'
example: 1
})
},
]
result: ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
schema: SchemaRef(Reference{
ref: '#/components/schemas/Pet'
})
}
errors: [
ErrorSpec{
code: 404
message: 'Pet not found'
},
]
},
ActorMethod{
name: 'deletePet'
summary: 'Delete a pet by ID'
example: openrpc.ExamplePairing{
params: [
openrpc.ExampleRef(openrpc.Example{
name: 'Example petId'
description: 'Example ID of the pet to delete'
value: 1
}),
]
}
parameters: [
ContentDescriptor{
name: 'petId'
summary: 'ID of the pet to delete'
description: 'ID of the pet to delete'
required: true
schema: SchemaRef(Schema{
typ: 'integer'
format: 'int64'
example: 1
})
},
]
result: ContentDescriptor{
name: 'result'
description: 'The response of the operation.'
required: true
}
errors: [
ErrorSpec{
code: 404
message: 'Pet not found'
},
]
},
]
objects: [
BaseObject{
schema: Schema{
typ: 'object'
properties: {
'id': SchemaRef(Schema{
typ: 'integer'
format: 'int64'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: ['id', 'name']
}
},
BaseObject{
schema: Schema{
typ: 'object'
properties: {
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: ['name']
}
},
BaseObject{
schema: Schema{
typ: 'array'
items: jsonschema.Items(SchemaRef(Reference{
ref: '#/components/schemas/Pet'
}))
}
},
]
}
pub fn test_from_openapi() ! {
// panic(from_openapi(openapi_spec)!)
assert from_openapi(openapi_spec)! == actor_spec
}

View File

@@ -1,106 +0,0 @@
module specification
import incubaid.herolib.schemas.openrpc { ContentDescriptor, ErrorSpec, Method, OpenRPC }
import incubaid.herolib.schemas.jsonschema { Reference, Schema }
import incubaid.herolib.core.texttools
// Helper function: Convert OpenRPC Method to ActorMethod
fn openrpc_method_to_actor_method(method Method) ActorMethod {
mut parameters := []ContentDescriptor{}
mut errors := []ErrorSpec{}
// Process parameters
for param in method.params {
if param is ContentDescriptor {
parameters << param
} else {
panic('Method param should be inflated')
}
}
// Process errors
for err in method.errors {
if err is ErrorSpec {
errors << err
} else {
panic('Method error should be inflated')
}
}
if method.result is Reference {
panic('Method result should be inflated')
}
return ActorMethod{
name: method.name
description: method.description
summary: method.summary
parameters: parameters
result: method.result as ContentDescriptor
errors: errors
}
}
// // Helper function: Extract Structs from OpenRPC Components
// fn extract_structs_from_openrpc(openrpc OpenRPC) []Struct {
// mut structs := []Struct{}
// for schema_name, schema in openrpc.components.schemas {
// if schema is Schema {
// mut fields := []Struct.Field{}
// for field_name, field_schema in schema.properties {
// if field_schema is Schema {
// fields << Struct.Field{
// name: field_name
// typ: field_schema.to_code() or { panic(err) }
// description: field_schema.description
// required: field_name in schema.required
// }
// }
// }
// structs << Struct{
// name: schema_name
// description: schema.description
// fields: fields
// }
// }
// }
// return structs
// }
// Converts OpenRPC to ActorSpecification
pub fn from_openrpc(spec OpenRPC) !ActorSpecification {
mut methods := []ActorMethod{}
mut objects := []BaseObject{}
// Process methods
for method in spec.methods {
methods << openrpc_method_to_actor_method(spec.inflate_method(method))
}
// Process objects (schemas)
// structs := extract_structs_from_openrpc(spec)
for key, schema in spec.components.schemas {
if schema is Schema {
if schema.typ == 'object' {
objects << BaseObject{
schema: Schema{
...schema
title: texttools.pascal_case(key)
id: texttools.snake_case(key)
}
}
}
}
}
return ActorSpecification{
name: spec.info.title
description: spec.info.description
interfaces: [.openrpc]
methods: methods
objects: objects
}
}


@@ -1,434 +0,0 @@
module specification
import incubaid.herolib.develop.codetools as code { Struct }
import incubaid.herolib.schemas.openrpc { ContentDescriptor }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.jsonschema { Reference, Schema, SchemaRef }
const openrpc_spec = openrpc.OpenRPC{
openrpc: '1.0.0-rc1'
info: openrpc.Info{
title: 'Petstore'
license: openrpc.License{
name: 'MIT'
}
version: '1.0.0'
}
servers: [
openrpc.Server{
name: 'localhost'
url: openrpc.RuntimeExpression('http://localhost:8080')
},
]
methods: [
openrpc.Method{
name: 'list_pets'
summary: 'List all pets'
params: [
openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'limit'
description: 'How many items to return at one time (max 100)'
required: false
schema: SchemaRef(Schema{
typ: 'integer'
minimum: 1
})
}),
]
result: openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'pets'
description: 'A paged array of pets'
schema: SchemaRef(Schema{
typ: 'array'
items: jsonschema.Items(SchemaRef(Reference{
ref: '#/components/schemas/Pet'
}))
})
})
examples: [
openrpc.ExamplePairing{
name: 'listPetExample'
description: 'List pet example'
},
]
},
openrpc.Method{
name: 'create_pet'
summary: 'Create a pet'
params: [
openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'newPetName'
description: 'Name of pet to create'
required: true
schema: SchemaRef(Schema{
typ: 'string'
})
}),
openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'newPetTag'
description: 'Pet tag to create'
schema: SchemaRef(Schema{
typ: 'string'
})
}),
]
result: openrpc.ContentDescriptorRef(Reference{
ref: '#/components/contentDescriptors/PetId'
})
examples: [
openrpc.ExamplePairing{
name: 'createPetExample'
description: 'Create pet example'
},
]
},
openrpc.Method{
name: 'get_pet'
summary: 'Info for a specific pet'
params: [
openrpc.ContentDescriptorRef(Reference{
ref: '#/components/contentDescriptors/PetId'
}),
]
result: openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'pet'
description: 'Expected response to a valid request'
schema: SchemaRef(Reference{
ref: '#/components/schemas/Pet'
})
})
examples: [
openrpc.ExamplePairing{
name: 'getPetExample'
description: 'Get pet example'
},
]
},
openrpc.Method{
name: 'update_pet'
summary: 'Update a pet'
params: [
openrpc.ContentDescriptorRef(Reference{
ref: '#/components/contentDescriptors/PetId'
}),
openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'updatedPetName'
description: 'New name for the pet'
required: true
schema: SchemaRef(Schema{
typ: 'string'
})
}),
openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'updatedPetTag'
description: 'New tag for the pet'
schema: SchemaRef(Schema{
typ: 'string'
})
}),
]
result: openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'pet'
description: 'Updated pet object'
schema: SchemaRef(Reference{
ref: '#/components/schemas/Pet'
})
})
examples: [
openrpc.ExamplePairing{
name: 'updatePetExample'
description: 'Update pet example'
},
]
},
openrpc.Method{
name: 'delete_pet'
summary: 'Delete a pet'
params: [
openrpc.ContentDescriptorRef(Reference{
ref: '#/components/contentDescriptors/PetId'
}),
]
result: openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'success'
description: 'Boolean indicating success'
schema: SchemaRef(Schema{
typ: 'boolean'
})
})
examples: [
openrpc.ExamplePairing{
name: 'deletePetExample'
description: 'Delete pet example'
},
]
},
]
components: openrpc.Components{
content_descriptors: {
'PetId': openrpc.ContentDescriptorRef(ContentDescriptor{
name: 'petId'
description: 'The ID of the pet'
required: true
schema: SchemaRef(Reference{
ref: '#/components/schemas/PetId'
})
})
}
schemas: {
'PetId': SchemaRef(Schema{
typ: 'integer'
minimum: 0
})
'Pet': SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: ['id', 'name']
})
}
}
}
const actor_spec = ActorSpecification{
name: 'Petstore'
structure: Struct{
is_pub: false
}
interfaces: [.openrpc]
methods: [
ActorMethod{
name: 'list_pets'
summary: 'List all pets'
parameters: [
ContentDescriptor{
name: 'limit'
description: 'How many items to return at one time (max 100)'
required: false
schema: SchemaRef(Schema{
typ: 'integer'
minimum: 1
})
},
]
result: ContentDescriptor{
name: 'pets'
description: 'A paged array of pets'
schema: SchemaRef(Schema{
typ: 'array'
items: jsonschema.Items(SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
}))
})
}
},
ActorMethod{
name: 'create_pet'
summary: 'Create a pet'
parameters: [
ContentDescriptor{
name: 'newPetName'
description: 'Name of pet to create'
required: true
schema: SchemaRef(Schema{
typ: 'string'
})
},
ContentDescriptor{
name: 'newPetTag'
description: 'Pet tag to create'
schema: SchemaRef(Schema{
typ: 'string'
})
},
]
},
ActorMethod{
name: 'get_pet'
summary: 'Info for a specific pet'
result: ContentDescriptor{
name: 'pet'
description: 'Expected response to a valid request'
schema: SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
})
}
},
ActorMethod{
name: 'update_pet'
summary: 'Update a pet'
parameters: [
ContentDescriptor{
name: 'updatedPetName'
description: 'New name for the pet'
required: true
schema: SchemaRef(Schema{
typ: 'string'
})
},
ContentDescriptor{
name: 'updatedPetTag'
description: 'New tag for the pet'
schema: SchemaRef(Schema{
typ: 'string'
})
},
]
result: ContentDescriptor{
name: 'pet'
description: 'Updated pet object'
schema: SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
})
}
},
ActorMethod{
name: 'delete_pet'
summary: 'Delete a pet'
result: ContentDescriptor{
name: 'success'
description: 'Boolean indicating success'
schema: SchemaRef(Schema{
typ: 'boolean'
})
}
},
]
objects: [
BaseObject{
schema: Schema{
id: 'pet'
title: 'Pet'
typ: 'object'
properties: {
'id': SchemaRef(Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: ['id', 'name']
}
},
]
}
pub fn test_from_openrpc() ! {
actor_spec_ := from_openrpc(openrpc_spec)!
assert actor_spec_.methods.len == actor_spec.methods.len
assert_methods_match(actor_spec_.methods[0], actor_spec.methods[0])
// assert from_openrpc(openrpc_spec)! == actor_spec
}
fn assert_methods_match(a ActorMethod, b ActorMethod) {
// Compare method names
assert a.name == b.name, 'Method names do not match: ${a.name} != ${b.name}'
// Compare summaries
assert a.summary == b.summary, 'Method summaries do not match for method ${a.name}.'
// Compare descriptions
assert a.description == b.description, 'Method descriptions do not match for method ${a.name}.'
// Compare parameters count
assert a.parameters.len == b.parameters.len, 'Parameter counts do not match for method ${a.name}.'
// Compare each parameter
for i, param_a in a.parameters {
assert_params_match(param_a, b.parameters[i], a.name)
}
// Compare result
assert_params_match(a.result, b.result, a.name)
}
fn assert_params_match(a ContentDescriptor, b ContentDescriptor, method_name string) {
// Compare parameter names
assert a.name == b.name, 'Parameter names do not match in method ${method_name}: ${a.name} != ${b.name}'
// Compare summaries
assert a.summary == b.summary, 'Parameter summaries do not match in method ${method_name}: ${a.name}'
// Compare descriptions
assert a.description == b.description, 'Parameter descriptions do not match in method ${method_name}: ${a.name}'
// Compare required flags
assert a.required == b.required, 'Required flags do not match in method ${method_name}: ${a.name}'
// Compare schemas
// assert_schemas_match(a.schema, b.schema, method_name, a.name)
}
// fn assert_schemas_match(a jsonschema.SchemaRef, b jsonschema.SchemaRef, method_name string, param_name string) {
// if a is Schema &&
// // Compare schema types
// assert a.typ == b.typ, 'Schema types do not match for parameter ${param_name} in method ${method_name}: ${a.typ} != ${b.typ}'
// // Compare schema titles
// assert a.title == b.title, 'Schema titles do not match for parameter ${param_name} in method ${method_name}.'
// // Compare schema descriptions
// assert a.description == b.description, 'Schema descriptions do not match for parameter ${param_name} in method ${method_name}.'
// // Compare other schema fields as needed (e.g., properties, additional properties, items, etc.)
// // Add more checks here if needed for deeper schema comparisons
// }


@@ -1,205 +0,0 @@
module specification
import incubaid.herolib.develop.codetools as code { Struct }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.openrpc { ContentDescriptor, ErrorSpec, ExamplePairing }
import incubaid.herolib.schemas.jsonschema { Reference, Schema }
pub struct ActorSpecification {
pub mut:
version string = '1.0.0'
openapi ?openapi.OpenAPI
openrpc ?openrpc.OpenRPC
name string @[omitempty]
description string @[omitempty]
structure Struct @[omitempty]
interfaces []ActorInterface @[omitempty]
methods []ActorMethod @[omitempty]
objects []BaseObject @[omitempty]
}
pub enum ActorInterface {
openrpc
openapi
webui
command
http
}
pub struct ActorMethod {
pub:
name string @[omitempty]
description string @[omitempty]
summary string
example ExamplePairing
parameters []ContentDescriptor
result ContentDescriptor
errors []ErrorSpec
category MethodCategory
}
pub struct BaseObject {
pub mut:
schema Schema
new_method ?ActorMethod
get_method ?ActorMethod
set_method ?ActorMethod
delete_method ?ActorMethod
list_method ?ActorMethod
filter_method ?ActorMethod
other_methods []ActorMethod
}
pub enum MethodCategory {
base_object_new
base_object_get
base_object_set
base_object_delete
base_object_list
other
}
// returns whether method belongs to a given base object
// TODO: link to more info about base object methods
fn (m ActorMethod) belongs_to_object(obj BaseObject) bool {
base_obj_is_param := m.parameters
.filter(it.schema is Schema)
.map(it.schema as Schema)
.any(it.id == obj.schema.id)
base_obj_is_result := if m.result.schema is Schema {
m.result.schema.id == obj.schema.id
} else {
ref := m.result.schema as Reference
ref.ref.all_after_last('/') == obj.name()
}
return base_obj_is_param || base_obj_is_result
}
pub fn (s ActorSpecification) validate() ActorSpecification {
mut validated_objects := []BaseObject{}
for obj_ in s.objects {
mut obj := obj_
if obj.schema.id == '' {
obj.schema.id = obj.schema.title
}
methods := s.methods.filter(it.belongs_to_object(obj))
if m := methods.filter(it.is_new_method())[0] {
obj.new_method = m
}
if m := methods.filter(it.is_set_method())[0] {
obj.set_method = m
}
if m := methods.filter(it.is_get_method())[0] {
obj.get_method = m
}
if m := methods.filter(it.is_delete_method())[0] {
obj.delete_method = m
}
if m := methods.filter(it.is_list_method())[0] {
obj.list_method = m
}
validated_objects << BaseObject{
...obj
other_methods: methods.filter(!it.is_crudlf_method())
}
}
return ActorSpecification{
...s
objects: validated_objects
}
}
// method category returns what category a method falls under
pub fn (s ActorSpecification) method_type(method ActorMethod) MethodCategory {
return if s.is_base_object_new_method(method) {
.base_object_new
} else if s.is_base_object_get_method(method) {
.base_object_get
} else if s.is_base_object_set_method(method) {
.base_object_set
} else if s.is_base_object_delete_method(method) {
.base_object_delete
} else if s.is_base_object_list_method(method) {
.base_object_list
} else {
.other
}
}
// a base object method is a method that is a
// CRUD+list+filter method of a base object
fn (s ActorSpecification) is_base_object_method(method ActorMethod) bool {
base_obj_is_param := method.parameters
.filter(it.schema is Schema)
.map(it.schema as Schema)
.any(it.id in s.objects.map(it.schema.id))
base_obj_is_result := if method.result.schema is Schema {
method.result.schema.id in s.objects.map(it.name())
} else {
ref := method.result.schema as Reference
ref.ref.all_after_last('/') in s.objects.map(it.name())
}
return base_obj_is_param || base_obj_is_result
}
fn (m ActorMethod) is_new_method() bool {
return m.name.starts_with('new')
}
fn (m ActorMethod) is_get_method() bool {
return m.name.starts_with('get')
}
fn (m ActorMethod) is_set_method() bool {
return m.name.starts_with('set')
}
fn (m ActorMethod) is_delete_method() bool {
return m.name.starts_with('delete')
}
fn (m ActorMethod) is_list_method() bool {
return m.name.starts_with('list')
}
fn (m ActorMethod) is_filter_method() bool {
return m.name.starts_with('filter')
}
fn (m ActorMethod) is_crudlf_method() bool {
return m.is_new_method() || m.is_get_method() || m.is_set_method() || m.is_delete_method()
|| m.is_list_method() || m.is_filter_method()
}
pub fn (o BaseObject) name() string {
return if o.schema.id.trim_space() != '' {
o.schema.id.trim_space()
} else {
o.schema.title.trim_space()
}
}
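
// Illustrative sketch (not part of the original file): an object's name
// prefers schema.id over schema.title; the values below are made up.
fn example_base_object_name() {
	obj := BaseObject{
		schema: Schema{
			id:    'pet'
			title: 'Pet'
		}
	}
	assert obj.name() == 'pet'
}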
fn (s ActorSpecification) is_base_object_new_method(method ActorMethod) bool {
return s.is_base_object_method(method) && method.name.starts_with('new')
}
fn (s ActorSpecification) is_base_object_get_method(method ActorMethod) bool {
return s.is_base_object_method(method) && method.name.starts_with('get')
}
fn (s ActorSpecification) is_base_object_set_method(method ActorMethod) bool {
return s.is_base_object_method(method) && method.name.starts_with('set')
}
fn (s ActorSpecification) is_base_object_delete_method(method ActorMethod) bool {
return s.is_base_object_method(method) && method.name.starts_with('delete')
}
fn (s ActorSpecification) is_base_object_list_method(method ActorMethod) bool {
return s.is_base_object_method(method) && method.name.starts_with('list')
}


@@ -1,97 +0,0 @@
module specification
import incubaid.herolib.schemas.jsonschema { Schema, SchemaRef }
import incubaid.herolib.schemas.openapi { Components, Info, MediaType, OpenAPI, Operation, Parameter, PathItem, ResponseSpec, ServerSpec }
import net.http
// Converts ActorSpecification to OpenAPI
pub fn (s ActorSpecification) to_openapi() OpenAPI {
if openapi_spec := s.openapi {
return openapi_spec
}
mut paths := map[string]PathItem{}
// Map ActorMethods to paths
for method in s.methods {
op := method.to_openapi_operation()
paths['${method.http_path()}'] = match method.http_method() {
.get {
PathItem{
get: op
}
}
else {
panic('unsupported http method')
}
}
// Assign operation to corresponding HTTP method
// TODO: what about other verbs
}
mut schemas := map[string]SchemaRef{}
for object in s.objects {
schemas[object.schema.id] = object.to_schema()
}
return OpenAPI{
openapi: '3.0.0'
info: Info{
title: s.name
summary: s.description
description: s.description
version: '1.0.0'
}
servers: [
ServerSpec{
url: 'http://localhost:8080'
description: 'Default server'
},
]
paths: paths
components: Components{
schemas: schemas
}
}
}
fn (bo BaseObject) to_schema() Schema {
return Schema{}
}
fn (m ActorMethod) http_path() string {
return m.name
}
fn (m ActorMethod) http_method() http.Method {
return .get
}
fn (method ActorMethod) to_openapi_operation() Operation {
mut op := Operation{
summary: method.summary
description: method.description
operation_id: method.name
}
// Convert parameters to OpenAPI format
for param in method.parameters {
op.parameters << Parameter{
name: param.name
in_: 'query' // Default to query parameters; adjust based on function context
description: param.description
required: param.required
schema: param.schema
}
}
// if method.is_void()
op.responses['200'] = ResponseSpec{
description: method.description
content: {
'application/json': MediaType{
schema: method.result.schema
}
}
}
return op
}


@@ -1,183 +0,0 @@
module specification
import incubaid.herolib.develop.codetools as code
import incubaid.herolib.schemas.jsonschema { Schema, SchemaRef }
import incubaid.herolib.schemas.openapi
import incubaid.herolib.schemas.openrpc
const actor_spec = ActorSpecification{
name: 'Petstore'
structure: code.Struct{
is_pub: false
}
interfaces: [.openrpc]
methods: [
ActorMethod{
name: 'list_pets'
summary: 'List all pets'
parameters: [
openrpc.ContentDescriptor{
name: 'limit'
description: 'How many items to return at one time (max 100)'
required: false
schema: SchemaRef(Schema{
typ: 'integer'
minimum: 1
})
},
]
result: openrpc.ContentDescriptor{
name: 'pets'
description: 'A paged array of pets'
schema: SchemaRef(Schema{
typ: 'array'
items: jsonschema.Items(SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
}))
})
}
},
ActorMethod{
name: 'create_pet'
summary: 'Create a pet'
parameters: [
openrpc.ContentDescriptor{
name: 'newPetName'
description: 'Name of pet to create'
required: true
schema: SchemaRef(Schema{
typ: 'string'
})
},
openrpc.ContentDescriptor{
name: 'newPetTag'
description: 'Pet tag to create'
schema: SchemaRef(Schema{
typ: 'string'
})
},
]
},
ActorMethod{
name: 'get_pet'
summary: 'Info for a specific pet'
result: openrpc.ContentDescriptor{
name: 'pet'
description: 'Expected response to a valid request'
schema: SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
})
}
},
ActorMethod{
name: 'update_pet'
summary: 'Update a pet'
parameters: [
openrpc.ContentDescriptor{
name: 'updatedPetName'
description: 'New name for the pet'
required: true
schema: SchemaRef(Schema{
typ: 'string'
})
},
openrpc.ContentDescriptor{
name: 'updatedPetTag'
description: 'New tag for the pet'
schema: SchemaRef(Schema{
typ: 'string'
})
},
]
result: openrpc.ContentDescriptor{
name: 'pet'
description: 'Updated pet object'
schema: SchemaRef(Schema{
typ: 'object'
properties: {
'id': SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: [
'id',
'name',
]
})
}
},
ActorMethod{
name: 'delete_pet'
summary: 'Delete a pet'
result: openrpc.ContentDescriptor{
name: 'success'
description: 'Boolean indicating success'
schema: SchemaRef(Schema{
typ: 'boolean'
})
}
},
]
objects: [
BaseObject{
schema: Schema{
id: 'pet'
title: 'Pet'
typ: 'object'
properties: {
'id': SchemaRef(jsonschema.Reference{
ref: '#/components/schemas/PetId'
})
'name': SchemaRef(Schema{
typ: 'string'
})
'tag': SchemaRef(Schema{
typ: 'string'
})
}
required: ['id', 'name']
}
},
]
}
// Tests conversion of ActorSpecification to OpenAPI
pub fn test_specification_to_openapi() {
	spec := actor_spec.to_openapi()
	assert spec.info.title == 'Petstore'
	assert spec.paths.len == actor_spec.methods.len
}


@@ -1,71 +0,0 @@
module specification
import incubaid.herolib.schemas.openrpc { Components, OpenRPC }
import incubaid.herolib.schemas.jsonschema { SchemaRef }
import incubaid.herolib.schemas.jsonschema.codegen
// pub fn from_openrpc(spec openrpc.OpenRPC) !ActorSpecification {
// // Extract Actor metadata from OpenRPC info
// // actor_name := openrpc_doc.info.title
// // actor_description := openrpc_doc.info.description
// // // Generate methods
// // mut methods := []ActorMethod{}
// // for method in openrpc_doc.methods {
// // method_code := method.to_code()! // Using provided to_code function
// // methods << ActorMethod{
// // name: method.name
// // func: method_code
// // }
// // }
// // // Generate BaseObject structs from schemas
// // mut objects := []BaseObject{}
// // for key, schema_ref in openrpc_doc.components.schemas {
// // struct_obj := schema_ref.to_code()! // Assuming schema_ref.to_code() converts schema to Struct
// // // objects << BaseObject{
// // // structure: code.Struct{
// // // name: struct_obj.name
// // // }
// // // }
// // }
// // Build the Actor struct
// return ActorSpecification{
// // name: actor_name
// // description: actor_description
// // methods: methods
// // objects: objects
// }
// }
pub fn (specification ActorSpecification) to_openrpc() OpenRPC {
mut schemas := map[string]SchemaRef{}
for obj in specification.objects {
schemas[obj.schema.id] = obj.schema
// for child in obj.children {
// schemas[child.name] = struct_to_schema(child)
// }
}
return OpenRPC{
info: openrpc.Info{
title: specification.name.title()
version: '1.0.0'
}
methods: specification.methods.map(method_to_openrpc_method(it))
components: Components{
schemas: schemas
}
}
}
pub fn method_to_openrpc_method(method ActorMethod) openrpc.Method {
return openrpc.Method{
name: method.name
summary: method.summary
description: method.description
params: method.parameters.map(openrpc.ContentDescriptorRef(it))
result: openrpc.ContentDescriptorRef(method.result)
errors: method.errors.map(openrpc.ErrorRef(it))
}
}


@@ -1,140 +0,0 @@
# Stage Module
The **Stage** module is a core component of the **Baobab** (Base Object and Actor Backend) library. It provides the infrastructure for handling RPC-based communication and managing the lifecycle of **Actors** and **Actions**. This module facilitates processing incoming requests, converting them to actions, and ensuring their correct execution.
## Architecture Overview
The **Stage** module operates based on the following architecture:
1. **RPC Request Handling**:
- An **Interface Handler** receives an RPC request. Supported interfaces include:
- **OpenRPC**
- **JSON-RPC**
- **OpenAPI**
2. **Action Creation**:
- The **Interface Handler** converts the incoming request into an **Action**, which represents the task to be executed.
3. **Action Execution**:
- The **Interface Handler** passes the **Action** to the **Director** for coordinated execution.
- (Note: Currently, the **Director** is not fully implemented. Actions are passed directly to the **Actor** for execution.)
4. **Actor Processing**:
- The **Actor** uses its `act` method to execute the **Action**.
- The result of the **Action** is stored in its `result` field, and the **Action** is returned.
5. **RPC Response Generation**:
- The **Interface Handler** converts the resulting **Action** back into the appropriate RPC response format and returns it.
---
## Key Components
### **Interface Handlers**
- **Responsibilities**:
- Receive and parse incoming RPC requests.
- Convert requests into **Actions**.
- Convert resulting **Actions** into appropriate RPC responses.
- Files:
- `interfaces/jsonrpc_interface.v`
- `interfaces/openapi_interface.v`
### **Director**
- **Responsibilities**:
- (Planned) Coordinate the execution of **Actions**.
- Handle retries, timeouts, and error recovery.
- File:
- `director.v`
### **Actors**
- **Responsibilities**:
- Execute **Actions** using their `act` method.
- Populate the `result` field of **Actions** with the execution result.
- File:
- `actor.v`
### **Actions**
- **Responsibilities**:
- Represent tasks to be executed by **Actors**.
- Carry results back after execution.
- File:
- `action.v`
### **Executor**
- **Responsibilities**:
- Manage the assignment of **Actions** to **Actors**.
- File:
- `executor.v`
---
## Directory Structure
```
stage/
interfaces/
jsonrpc_interface.v # Converts JSON-RPC requests to Actions
openapi_interface.v # Converts OpenAPI requests to Actions
actor.v # Defines the Actor and its behavior
action.v # Defines the Action structure and utilities
executor.v # Executes Actions on Actors
director.v # (Planned) Coordinates actors, actions, and retries
```
---
## Workflow Example
### 1. Receiving an RPC Request
An RPC request is received by an interface handler:
```json
{
"jsonrpc": "2.0",
"method": "doSomething",
"params": { "key": "value" },
"id": 1
}
```
### 2. Converting the Request to an Action
The interface handler converts the request into an **Action**:
```v
action := interfaces.action_from_jsonrpc_request(request)
```
### 3. Executing the Action
The action is passed directly to an **Actor** for execution:
```v
mut actor := MyActor{name: 'actor-1'}
resulting_action := actor.act(action)!
```
### 4. Returning the RPC Response
The interface handler converts the resulting **Action** back into a JSON-RPC response:
```json
{
"jsonrpc": "2.0",
"result": { "status": "success", "data": "..." },
"id": 1
}
```
---
## Future Improvements
- **Director Implementation**:
- Add retries and timeout handling for actions.
- Provide better coordination for complex workflows.
- **Enhanced Interfaces**:
- Add support for more RPC protocols.
---
This module is a crucial building block of the **Baobab** library, designed to streamline RPC-based communication and task execution with flexibility and scalability.


@@ -1,15 +0,0 @@
module stage
// import incubaid.herolib.core.smartid
pub struct Action {
pub mut:
id string
name string
	priority int = 10 // 0 is the highest priority; 10 is the default
params string // json encoded params
result string // can be used to remember outputs
// run bool = true // certain actions can be defined but meant to be executed directly
comments string
done bool // if done then no longer need to process
}
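
// Illustrative sketch (not part of the original file): building and completing
// an Action by hand; all field values below are made up for demonstration.
fn example_action() Action {
	mut a := Action{
		id:     '1'
		name:   'doSomething'
		params: '{"key":"value"}' // json encoded params
	}
	a.result = '{"status":"success"}'
	a.done = true
	return a
}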


@@ -1,47 +0,0 @@
module stage
import incubaid.herolib.core.redisclient
// Processor struct for managing procedure calls
pub struct Client {
pub mut:
rpc redisclient.RedisRpc // Redis RPC mechanism
}
// Parameters for processing a procedure call
@[params]
pub struct Params {
pub:
timeout int // Timeout in seconds
}
pub struct ClientConfig {
	ActorConfig // redis_url ('localhost:6379' by default) is inherited from the embedded ActorConfig
}
pub fn new_client(config ActorConfig) !Client {
mut redis := redisclient.new(config.redis_url)!
mut rpc_q := redis.rpc_get(config.redis_queue_name())
return Client{
rpc: rpc_q
}
}
// Process the procedure call
pub fn (mut p Client) call_to_action(action Action, params Params) !Action {
// Use RedisRpc's `call` to send the call and wait for the response
response_data := p.rpc.call(redisclient.RPCArgs{
cmd: action.name
data: action.params
timeout: u64(params.timeout * 1000) // Convert seconds to milliseconds
wait: true
})!
return Action{
...action
result: response_data
}
}
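
// Usage sketch (assumes a reachable Redis server and an actor consuming the
// 'actor_mail' queue; the names and params below are illustrative):
//
//	mut client := new_client(ActorConfig{ name: 'mail' })!
//	done := client.call_to_action(Action{ name: 'send', params: '{"to":"a@b.c"}' },
//		Params{ timeout: 5 })!
//	println(done.result)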


@@ -1,79 +0,0 @@
module stage
import incubaid.herolib.baobab.osis { OSIS }
import incubaid.herolib.core.redisclient
@[heap]
pub interface IActor {
name string
mut:
act(Action) !Action
}
pub struct Actor {
ActorConfig
mut:
osis OSIS
}
@[params]
pub struct ActorConfig {
pub:
name string
version string
redis_url string = 'localhost:6379'
}
pub fn (config ActorConfig) redis_queue_name() string {
mut str := 'actor_${config.name}'
if config.version != '' {
str += '_${config.version}'
}
return str
}
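
// Illustrative example (made-up values): name 'mail' with version 'v1' yields
// the queue name 'actor_mail_v1'; with an empty version it is just 'actor_mail'.
fn example_queue_name() {
	cfg := ActorConfig{
		name:    'mail'
		version: 'v1'
	}
	assert cfg.redis_queue_name() == 'actor_mail_v1'
}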
pub fn new_actor(config ActorConfig) !Actor {
return Actor{
ActorConfig: config
osis: osis.new()!
}
}
pub fn (a ActorConfig) get_redis_rpc() !redisclient.RedisRpc {
mut redis := redisclient.new(a.redis_url)!
return redis.rpc_get(a.redis_queue_name())
}
pub fn (a ActorConfig) version(v string) ActorConfig {
return ActorConfig{
...a
version: v
}
}
pub fn (a ActorConfig) example() ActorConfig {
return ActorConfig{
...a
version: 'example'
}
}
pub fn (mut a IActor) handle(method string, data string) !string {
action := a.act(
name: method
params: data
)!
return action.result
}
// // Actor listens to the Redis queue for method invocations
// pub fn (mut a IActor) run() ! {
// mut redis := redisclient.new('localhost:6379') or { panic(err) }
// mut rpc := redis.rpc_get(a.name)
// println('Actor started and listening for tasks...')
// for {
// rpc.process(a.handle)!
// time.sleep(time.millisecond * 100) // Prevent CPU spinning
// }
// }

View File

@@ -1 +0,0 @@
module stage


@@ -1,33 +0,0 @@
module stage
// Error struct for error handling
pub struct ActionError {
reason ErrorReason
}
// Enum for different error reasons
pub enum ErrorReason {
timeout
serialization_failed
deserialization_failed
enqueue_failed
}
pub fn (err ActionError) code() int {
return match err.reason {
.timeout { 408 } // HTTP 408 Request Timeout
.serialization_failed { 500 } // HTTP 500 Internal Server Error
.deserialization_failed { 500 } // HTTP 500 Internal Server Error
.enqueue_failed { 503 } // HTTP 503 Service Unavailable
}
}
pub fn (err ActionError) msg() string {
explanation := match err.reason {
.timeout { 'The procedure call timed out.' }
.serialization_failed { 'Failed to serialize the procedure call.' }
.deserialization_failed { 'Failed to deserialize the procedure response.' }
.enqueue_failed { 'Failed to enqueue the procedure response.' }
}
return 'Procedure failed: ${explanation}'
}
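
// Illustrative example (not part of the original file): a timeout maps to
// HTTP 408 and a descriptive message.
fn example_action_error() {
	err := ActionError{
		reason: .timeout
	}
	assert err.code() == 408
	assert err.msg() == 'Procedure failed: The procedure call timed out.'
}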


@@ -1,16 +0,0 @@
module interfaces
import incubaid.herolib.schemas.jsonrpc
import incubaid.herolib.baobab.stage { Action }
pub fn action_from_jsonrpc_request(request jsonrpc.Request) Action {
return Action{
id: request.id
name: request.method
params: request.params
}
}
pub fn action_to_jsonrpc_response(action Action) jsonrpc.Response {
return jsonrpc.new_response(action.id, action.result)
}
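
// Round-trip sketch (illustrative request values; assumes the jsonrpc.Request
// fields used above): convert a request to an Action, fake a result, and
// convert back to a response.
fn example_jsonrpc_roundtrip() {
	req := jsonrpc.Request{
		method: 'doSomething'
		params: '{"key":"value"}'
		id:     '1'
	}
	mut action := action_from_jsonrpc_request(req)
	action.result = action.params // echo back, as a stand-in for real work
	resp := action_to_jsonrpc_response(action)
	assert resp.id == req.id
}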

View File

@@ -1,48 +0,0 @@
module interfaces
import rand
import x.json2 as json { Any }
import incubaid.herolib.baobab.stage { Action, Client }
import incubaid.herolib.schemas.jsonrpc
import incubaid.herolib.schemas.openapi
pub struct OpenAPIInterface {
pub mut:
client Client
}
pub fn new_openapi_interface(client Client) &OpenAPIInterface {
return &OpenAPIInterface{client}
}
pub fn (mut i OpenAPIInterface) handle(request openapi.Request) !openapi.Response {
// Convert incoming OpenAPI request to a procedure call
action := action_from_openapi_request(request)
response := i.client.call_to_action(action) or { return err }
return action_to_openapi_response(response)
}
pub fn action_from_openapi_request(request openapi.Request) Action {
mut params := []Any{}
if request.arguments.len > 0 {
params << request.arguments.values()
}
if request.body != '' {
params << request.body
}
if request.parameters.len > 0 {
params << json.encode(request.parameters)
}
return Action{
id: rand.uuid_v4()
name: request.operation.operation_id
params: json.encode(params.str())
}
}
pub fn action_to_openapi_response(action Action) openapi.Response {
return openapi.Response{
body: action.result
}
}

View File

@@ -1,29 +0,0 @@
module interfaces
import incubaid.herolib.baobab.stage { Client }
import incubaid.herolib.schemas.jsonrpc
// handler for test echoes JSONRPC Request as JSONRPC Response
fn handler(request jsonrpc.Request) !jsonrpc.Response {
return jsonrpc.Response{
jsonrpc: request.jsonrpc
id: request.id
result: request.params
}
}
pub struct OpenRPCInterface {
pub mut:
client Client
}
pub fn new_openrpc_interface(client Client) &OpenRPCInterface {
return &OpenRPCInterface{client}
}
pub fn (mut i OpenRPCInterface) handle(request jsonrpc.Request) !jsonrpc.Response {
// Convert the incoming JSON-RPC request to a procedure call
action := action_from_jsonrpc_request(request)
response := i.client.call_to_action(action)!
return action_to_jsonrpc_response(response)
}

View File

@@ -1,91 +0,0 @@
module interfaces
// import os
// import time
// import veb
// import x.json2 {Any}
// import net.http
import incubaid.herolib.baobab.stage { Action }
import incubaid.herolib.schemas.openapi { Request }
pub fn openapi_request_to_action(request Request) Action {
// // Convert incoming OpenAPI request to a procedure call
// mut params := []Any{}
// if request.arguments.len > 0 {
// params << request.arguments.values().map(it.str()).clone()
// }
// if request.body != '' {
// params << request.body
// }
// if request.parameters != '' {
// params << request.body
// }
// if request.parameters.len != 0 {
// mut param_map := map[string]Any{} // Store parameters with correct types
// for param_name, param_value in request.parameters {
// operation_param := request.operation.parameters.filter(it.name == param_name)
// if operation_param.len > 0 {
// param_schema := operation_param[0].schema as Schema
// param_type := param_schema.typ
// param_format := param_schema.format
// // Convert parameter value to corresponding type
// match param_type {
// 'integer' {
// match param_format {
// 'int32' {
// param_map[param_name] = param_value.int() // Convert to int
// }
// 'int64' {
// param_map[param_name] = param_value.i64() // Convert to i64
// }
// else {
// param_map[param_name] = param_value.int() // Default to int
// }
// }
// }
// 'string' {
// param_map[param_name] = param_value // Already a string
// }
// 'boolean' {
// param_map[param_name] = param_value.bool() // Convert to bool
// }
// 'number' {
// match param_format {
// 'float' {
// param_map[param_name] = param_value.f32() // Convert to float
// }
// 'double' {
// param_map[param_name] = param_value.f64() // Convert to double
// }
// else {
// param_map[param_name] = param_value.f64() // Default to double
// }
// }
// }
// else {
// param_map[param_name] = param_value // Leave as string for unknown types
// }
// }
// } else {
// // If the parameter is not defined in the OpenAPI operation, skip or log it
// println('Unknown parameter: $param_name')
// }
// }
// // Encode the parameter map to JSON if needed
// params << json.encode(param_map.str())
// }
// call := Action{
// name: request.operation.operation_id
// params_json: json2.encode(params.str()) // Keep as a string since ProcedureCall expects a string
// }
// return call
return Action{}
}

View File

@@ -1,43 +0,0 @@
module interfaces
import incubaid.herolib.schemas.openapi { OpenAPI }
import incubaid.herolib.baobab.stage { ClientConfig }
import incubaid.herolib.schemas.openrpc { OpenRPC }
import veb
pub struct HTTPServer {
veb.Controller
}
pub struct Context {
veb.Context
}
pub struct HTTPServerConfig {
ClientConfig
pub:
openapi_specification OpenAPI
openrpc_specification OpenRPC
}
pub fn new_http_server() !&HTTPServer {
mut s := &HTTPServer{}
// client := actor.new_client(cfg.ClientConfig)!
// openapi_proxy := new_openapi_proxy(
// client: new_client(cfg.ClientConfig)!
// specification: cfg.openapi_spec
// )
// mut openrpc_controller := openrpc.new_http_controller(
// specification: cfg.openrpc_specification
// handler: new_openrpc_interface(client)
// )
// s.register_controller[openrpc.HTTPController, Context]('/openrpc', mut openrpc_controller)!
return s
}
pub fn (mut server HTTPServer) run() {
veb.run[HTTPServer, Context](mut server, 8082)
}

View File

@@ -1,164 +0,0 @@
module buildah
import incubaid.herolib.osal.core as osal
// import incubaid.herolib.ui.console
import incubaid.herolib.installers.lang.herolib
import incubaid.herolib.core.pathlib
import incubaid.herolib.builder
import incubaid.herolib.virt.utils
import os
import json
// Use shared container status from utils
pub type ContainerStatus = utils.ContainerStatus
pub struct IPAddress {
pub mut:
ipv4 string
ipv6 string
}
// need to fill in what is relevant
@[heap]
pub struct BuildAHContainer {
pub mut:
id string
builder bool
imageid string
imagename string
containername string
//TODO: not sure all below is needed
hero_in_container bool // once hero has been installed this is on; done once per session
// created time.Time
// ssh_enabled bool // if yes make sure ssh is enabled to the container
// ipaddr IPAddress
// forwarded_ports []string
// mounts []ContainerVolume
// ssh_port int // ssh port on node that is used to get ssh
// ports []string
// networks []string
// labels map[string]string @[str: skip]
// status ContainerStatus
// memsize int // in MB
// command string
}
@[params]
pub struct RunArgs {
pub mut:
cmd string
// TODO:/..
}
@[params]
pub struct PackageInstallArgs {
pub mut:
names string
// TODO:/..
}
// TODO: mimic osal.package_install('mc,tmux,git,rsync,curl,screen,redis,wget,git-lfs')!
// pub fn (mut self BuildAHContainer) package_install(args PackageInstallArgs) !{
// //TODO
// names := texttools.to_array(args.names)
// //now check which OS, need to make platform function on container level so we know which platform it is
// panic("implement")
// }
pub fn (mut self BuildAHContainer) copy(src string, dest string) ! {
mut executor := utils.buildah_exec(false)
executor.exec(['copy', self.id, src, dest]) or {
return utils.new_build_error('copy', self.containername, err.code(), err.msg(), err.msg())
}
}
pub fn (mut self BuildAHContainer) shell() ! {
mut executor := utils.buildah_exec(false)
executor.exec_interactive(['run', '--terminal', '--env', 'TERM=xterm', self.id, '/bin/bash']) or {
return utils.new_build_error('shell', self.containername, err.code(), err.msg(), err.msg())
}
}
pub fn (mut self BuildAHContainer) clean() ! {
cmd := '
#set -x
set +e
rm -rf /root/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/share/doc
#pacman -Rns $(pacman -Qtdq) --noconfirm
#pacman -Scc --noconfirm
rm -rf /var/lib/pacman/sync/*
rm -rf /tmp/*
rm -rf /var/tmp/*
find /var/log -type f -name "*.log" -exec truncate -s 0 {} \\;
rm -rf /home/*/.cache/*
rm -rf /usr/share/doc/*
rm -rf /usr/share/man/*
rm -rf /usr/share/info/*
rm -rf /usr/share/licenses/*
find /usr/share/locale -mindepth 1 -maxdepth 1 ! -name "en*" -exec rm -rf {} \\;
rm -rf /usr/share/i18n
rm -rf /usr/share/icons/*
rm -rf /usr/lib/modules/*
rm -rf /var/cache/pacman
journalctl --vacuum-time=1s
'
self.exec(cmd: cmd, stdout: false)!
}
pub fn (mut self BuildAHContainer) delete() ! {
mut executor := utils.buildah_exec(false)
executor.exec(['rm', self.containername]) or {
return utils.new_build_error('delete', self.containername, err.code(), err.msg(), err.msg())
}
}
pub fn (mut self BuildAHContainer) inspect() !BuilderInfo {
cmd := 'buildah inspect ${self.containername}'
job := self.exec(cmd: cmd)!
out := job.output
mut r := json.decode(BuilderInfo, out) or {
return error('Failed to decode JSON for inspect: ${err}')
}
return r
}
// mount the build container to a path and return the path where its mounted
pub fn (mut self BuildAHContainer) mount_to_path() !string {
cmd := 'buildah mount ${self.containername}'
out := self.exec(cmd:cmd)!
return out.trim_space()
}
pub fn (mut self BuildAHContainer) commit(image_name string) ! {
// Validate image name
validated_name := utils.validate_image_name(image_name) or {
return utils.new_validation_error('image_name', image_name, err.msg())
}
mut executor := utils.buildah_exec(false)
executor.exec(['commit', self.containername, validated_name]) or {
return utils.new_build_error('commit', self.containername, err.code(), err.msg(), err.msg())
}
}
pub fn (self BuildAHContainer) set_entrypoint(entrypoint string) ! {
mut executor := utils.buildah_exec(false)
executor.exec(['config', '--entrypoint', entrypoint, self.containername]) or {
return utils.new_build_error('set_entrypoint', self.containername, err.code(), err.msg(), err.msg())
}
}
pub fn (self BuildAHContainer) set_workingdir(workdir string) ! {
mut executor := utils.buildah_exec(false)
executor.exec(['config', '--workingdir', workdir, self.containername]) or {
return utils.new_build_error('set_workingdir', self.containername, err.code(), err.msg(), err.msg())
}
}
pub fn (self BuildAHContainer) set_cmd(command string) ! {
mut executor := utils.buildah_exec(false)
executor.exec(['config', '--cmd', command, self.containername]) or {
return utils.new_build_error('set_cmd', self.containername, err.code(), err.msg(), err.msg())
}
}
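
The config setters above are typically used to finish an image before committing it; a hedged sketch (the paths and image name are illustrative):

```go
fn build_example(mut c BuildAHContainer) ! {
	// configure image metadata, then commit the working container as an image
	c.set_workingdir('/app')!
	c.set_entrypoint('/usr/local/bin/myapp')!
	c.set_cmd('--serve')!
	c.commit('localhost/myapp:latest')!
}
```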

View File

@@ -1,31 +0,0 @@
module buildah
import incubaid.herolib.osal.core as osal
// import incubaid.herolib.ui.console
import incubaid.herolib.installers.lang.herolib
import incubaid.herolib.core.pathlib
import os
import json
pub fn (mut self BuildAHContainer) install_zinit() ! {
// https://github.com/threefoldtech/zinit
self.hero_copy()!
self.hero_play_execute('!!installer.zinit')!
// TODO: implement by making sure hero is in the build context and then use hero cmd to install this
self.set_entrypoint('/sbin/zinit init --container')!
}
pub fn (mut self BuildAHContainer) install_herodb() ! {
self.install_zinit()!
// the hero database gets installed and put in zinit for automatic start
self.hero_play_execute('!!installer.herodb')!
// TODO: the hero_play needs to be implemented
}
// installs mycelium and registers it with zinit for automatic start
pub fn (mut self BuildAHContainer) install_mycelium() ! {
self.install_zinit()!
// mycelium gets installed and put in zinit for automatic start
self.hero_play_execute('!!installer.mycelium')!
// TODO: the hero_play needs to be implemented
}

View File

@@ -1,86 +0,0 @@
module buildah
import incubaid.herolib.osal.core as osal
import incubaid.herolib.core.pathlib
import os
@[params]
pub struct Command {
pub mut:
name string // to give a name to your command, good to see logs...
cmd string
description string
timeout int = 3600 // timeout in sec
stdout bool = true
stdout_log bool = true
raise_error bool = true // if false, will not raise an error but still error report
ignore_error bool // means if error will just exit and not raise, there will be no error reporting
work_folder string // location where cmd will be executed
environment map[string]string // env variables
ignore_error_codes []int
scriptpath string // is the path where the script will be put which is executed
scriptkeep bool // means we don't remove the script
debug bool // if debug will put +ex in the script which is being executed and will make sure script stays
shell bool // means we will execute it in a shell interactive
retry int
interactive bool = true
async bool
runtime osal.RunTime
}
pub enum RunTime {
bash
python
heroscript
herocmd
v
}
// should use builders underneath
pub fn (mut self BuildAHContainer) exec(cmd Command) !osal.Job {
// make sure we have hero in the host node of self
self.hero_copy()!
scriptpath := osal.cmd_to_script_path(cmd: cmd.cmd, runtime: cmd.runtime)!
script_basename := os.base(scriptpath)
script_path_in_container := '/tmp/${script_basename}'
self.copy(scriptpath, script_path_in_container)!
// console.print_debug("copy ${scriptpath} into container '${self.containername}'")
cmd_str := 'buildah run ${self.id} ${script_path_in_container}'
// console.print_debug(cmd_str)
mut j := osal.exec(
name: cmd.name
cmd: cmd_str
description: cmd.description
timeout: cmd.timeout
stdout: cmd.stdout
stdout_log: cmd.stdout_log
raise_error: cmd.raise_error
ignore_error: cmd.ignore_error
ignore_error_codes: cmd.ignore_error_codes
scriptpath: cmd.scriptpath
scriptkeep: cmd.scriptkeep
debug: cmd.debug
shell: cmd.shell
retry: cmd.retry
interactive: cmd.interactive
async: cmd.async
) or {
mut epath := pathlib.get_file(path: scriptpath, create: false)!
c := epath.read()!
return error('cannot execute:\n${c}\nerror:\n${err}')
}
return j
}
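
A sketch of how the `exec` wrapper above might be driven, both for a plain bash command and for a heroscript (the command strings are illustrative):

```go
fn exec_example(mut c BuildAHContainer) ! {
	// plain bash command inside the build container
	job := c.exec(cmd: 'apt-get update -yq', stdout: false)!
	println(job.output)
	// heroscript runtime: exec ensures hero is copied into the container first
	c.exec(cmd: '!!installer.zinit', runtime: .heroscript)!
}
```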

View File

@@ -1,110 +0,0 @@
module buildah
import incubaid.herolib.osal.core as osal
import incubaid.herolib.ui.console
import incubaid.herolib.virt.utils
import json
@[params]
pub struct BuildAHNewArgs {
pub mut:
herocompile bool
reset bool
default_image string = 'docker.io/ubuntu:latest'
install bool = true // make sure buildah is installed
}
// Use shared BuildPlatformType from utils
pub type BuildPlatformType = utils.BuildPlatformType
pub struct BuildAHFactory {
pub mut:
default_image string
platform BuildPlatformType
executor utils.Executor
}
pub fn new(args BuildAHNewArgs) !BuildAHFactory {
// Validate default image
validated_image := utils.validate_image_name(args.default_image) or {
return utils.new_validation_error('default_image', args.default_image, err.msg())
}
mut bahf := BuildAHFactory{
default_image: validated_image
executor: utils.buildah_exec(false)
}
if args.reset {
bahf.reset() or {
return utils.new_build_error('reset', 'factory', err.code(), err.msg(), err.msg())
}
}
// if args.herocompile {
// bahf.builder = builder.hero_compile()!
// }
return bahf
}
@[params]
pub struct BuildAhContainerNewArgs {
pub mut:
name string = 'default'
from string
delete bool = true
}
// TODO: implement, missing parts
// TODO: need to support a docker builder if we are on osx or windows, so we use the builders functionality as base for executing, not directly osal
pub fn (mut self BuildAHFactory) new(args_ BuildAhContainerNewArgs) !BuildAHContainer {
mut args := args_
if args.delete {
self.delete(args.name)!
}
if args.from == '' {
args.from = self.default_image
}
mut c := BuildAHContainer{
containername: args.name
imagename: args.from
}
return c
}
fn (mut self BuildAHFactory) list() ![]BuildAHContainer {
result := self.executor.exec(['containers', '--json']) or {
return utils.new_build_error('list', 'containers', err.code(), err.msg(), err.msg())
}
return utils.parse_json_output[BuildAHContainer](result.output) or {
return utils.new_build_error('list', 'containers', 1, err.msg(), err.msg())
}
}
// delete all builders
pub fn (mut self BuildAHFactory) reset() ! {
console.print_debug('remove all buildah containers')
self.executor.exec(['rm', '-a']) or {
return utils.new_build_error('reset', 'all', err.code(), err.msg(), err.msg())
}
}
pub fn (mut self BuildAHFactory) delete(name string) ! {
if self.exists(name)! {
console.print_debug('remove ${name}')
self.executor.exec(['rm', name]) or {
return utils.new_build_error('delete', name, err.code(), err.msg(), err.msg())
}
}
}
pub fn (mut self BuildAHFactory) exists(name string) !bool {
containers := self.list()!
for container in containers {
if container.containername == name {
return true
}
}
return false
}
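
The factory methods above combine into a simple lifecycle; a hedged sketch (the container name is illustrative):

```go
fn factory_example() ! {
	// create the factory, then a fresh working container from the default image
	mut factory := new(default_image: 'docker.io/ubuntu:latest')!
	mut c := factory.new(name: 'demo', delete: true)!
	// check for and remove the container by name
	if factory.exists('demo')! {
		factory.delete('demo')!
	}
}
```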

View File

@@ -1,38 +0,0 @@
module buildah
import incubaid.herolib.osal.core as osal
// copies the hero from host into guest and then executes the heroscript or command line
pub fn (mut self BuildAHContainer) hero_cmd_execute(cmd string) ! {
self.hero_copy()!
self.exec(cmd: cmd, runtime: .herocmd)!
}
// send a hero play command to the buildah container
pub fn (mut self BuildAHContainer) hero_play_execute(cmd string) ! {
self.hero_copy()!
panic('implement')
}
pub fn (mut self BuildAHContainer) hero_execute_script(cmd string) ! {
self.hero_copy()!
self.exec(cmd: cmd, runtime: .heroscript)!
}
// copies the hero from host into guest
pub fn (mut self BuildAHContainer) hero_copy() ! {
// TODO: check we are on linux, check also the platformtype arm or intel, if not right platform then build hero in container
panic('implement')
// if !osal.cmd_exists('hero') {
// herolib.hero_compile()!
// }
heropath := osal.cmd_path('hero')!
self.copy(heropath, '/usr/local/bin/hero')!
}
// get a container where we build hero and export hero from the container so we can use it for hero_copy
pub fn (mut self BuildAHContainer) hero_build() ! {
panic('implement')
}

View File

@@ -1,113 +0,0 @@
module buildah
struct BuilderInfo {
type_ string @[json: 'Type']
from_image string @[json: 'FromImage']
from_image_id string @[json: 'FromImageID']
from_image_digest string @[json: 'FromImageDigest']
group_add []string @[json: 'GroupAdd']
config string @[json: 'Config']
manifest string @[json: 'Manifest']
container string @[json: 'Container']
container_id string @[json: 'ContainerID']
mount_point string @[json: 'MountPoint']
process_label string @[json: 'ProcessLabel']
mount_label string @[json: 'MountLabel']
image_annotations ImageAnnotations @[json: 'ImageAnnotations']
image_created_by string @[json: 'ImageCreatedBy']
oci_v1 OCIv1 @[json: 'OCIv1']
docker Docker @[json: 'Docker']
default_mounts_file_path string @[json: 'DefaultMountsFilePath']
isolation string @[json: 'Isolation']
namespace_options []NamespaceOption @[json: 'NamespaceOptions']
capabilities []string @[json: 'Capabilities']
configure_network string @[json: 'ConfigureNetwork']
// cni_plugin_path string @[json: 'CNIPluginPath']
// cni_config_dir string @[json: 'CNIConfigDir']
// id_mapping_options IDMappingOptions @[json: 'IDMappingOptions']
history []string @[json: 'History']
devices []string @[json: 'Devices']
}
struct ImageAnnotations {
org_opencontainers_image_base_digest string @[json: 'org.opencontainers.image.base.digest']
org_opencontainers_image_base_name string @[json: 'org.opencontainers.image.base.name']
}
struct OCIv1 {
created string @[json: 'created']
architecture string @[json: 'architecture']
os string @[json: 'os']
config map[string]string @[json: 'config']
rootfs Rootfs @[json: 'rootfs']
}
struct Rootfs {
type_ string @[json: 'type']
diff_ids []string @[json: 'diff_ids']
}
struct Docker {
created string @[json: 'created']
container_config ContainerConfig @[json: 'container_config']
config DockerConfig @[json: 'config']
architecture string @[json: 'architecture']
os string @[json: 'os']
}
struct ContainerConfig {
hostname string @[json: 'Hostname']
domainname string @[json: 'Domainname']
user string @[json: 'User']
attach_stdin bool @[json: 'AttachStdin']
attach_stdout bool @[json: 'AttachStdout']
attach_stderr bool @[json: 'AttachStderr']
tty bool @[json: 'Tty']
open_stdin bool @[json: 'OpenStdin']
stdin_once bool @[json: 'StdinOnce']
env []string @[json: 'Env']
cmd []string @[json: 'Cmd']
image string @[json: 'Image']
volumes map[string]string @[json: 'Volumes']
working_dir string @[json: 'WorkingDir']
entrypoint []string @[json: 'Entrypoint']
on_build []string @[json: 'OnBuild']
labels map[string]string @[json: 'Labels']
}
struct DockerConfig {
// Assuming identical structure to ContainerConfig
// Define fields with @json: mapping if different
}
struct NamespaceOption {
name string @[json: 'Name']
host bool @[json: 'Host']
path string @[json: 'Path']
}
// struct IDMappingOptions {
// host_uid_mapping bool @[json: 'HostUIDMapping']
// host_gid_mapping bool @[json: 'HostGIDMapping']
// // uid_map []UIDMap @[json: 'UIDMap']
// // gid_map []GIDMap @[json: 'GIDMap']
// auto_user_ns bool @[json: 'AutoUserNs']
// auto_user_ns_opts AutoUserNsOpts @[json: 'AutoUserNsOpts']
// }
// struct UIDMap {
// // Define the structure with @json: mappings
// }
// struct GIDMap {
// // Define the structure with @json: mappings
// }
// struct AutoUserNsOpts {
// size int @[json: 'Size']
// initial_size int @[json: 'InitialSize']
// passwd_file string @[json: 'PasswdFile']
// group_file string @[json: 'GroupFile']
// additional_uid_mappings []UIDMap @[json: 'AdditionalUIDMappings']
// additional_gid_mappings []GIDMap @[json: 'AdditionalGIDMappings']
// }

View File

@@ -1,175 +0,0 @@
module buildah
import incubaid.herolib.osal.core as osal
import incubaid.herolib.ui.console
import os
@[params]
pub struct GetArgs {
pub mut:
reset bool
}
// builder machine based on arch and install vlang
// TODO need to change, go to ubuntu
pub fn builder_base(args GetArgs) !BuildAHContainer {
name := 'base'
console.print_header('buildah base build')
mut factory := new()!
mut builder := factory.new(name: name, from: 'scratch', delete: true)!
mount_path := builder.mount_to_path()!
if mount_path.len < 4 {
return error('mount_path needs to be at least 4 chars')
}
builder.exec(
cmd: '
export MOUNT_PATH=\'${mount_path}\'
mmdebstrap --variant=minbase --components="main,universe" --include="apt,base-files,base-passwd,bash,coreutils" noble \${MOUNT_PATH}
echo "Binding essential directories..."
mount --bind /dev "\${MOUNT_PATH}/dev"
mount --bind /proc "\${MOUNT_PATH}/proc"
mount --bind /sys "\${MOUNT_PATH}/sys"
echo "tzdata tzdata/Areas select Europe" | chroot "\${MOUNT_PATH}" debconf-set-selections
echo "tzdata tzdata/Zones/Europe select Brussels" | chroot "\${MOUNT_PATH}" debconf-set-selections
chroot \${MOUNT_PATH} apt update
# Set up APT for non-interactive installation
export DEBIAN_FRONTEND=noninteractive
export DEBCONF_NONINTERACTIVE_SEEN=true
# Update package lists
echo "Updating package lists in chroot..."
chroot \${MOUNT_PATH} apt-get update -yq
# Install required packages
echo "Installing essential packages..."
chroot \${MOUNT_PATH} apt-get install -yq screen bash coreutils curl mc unzip sudo which openssh-client openssh-server redis wget
echo "Cleaning up..."
umount "\${MOUNT_PATH}/dev" || true
umount "\${MOUNT_PATH}/proc" || true
umount "\${MOUNT_PATH}/sys" || true
'
)!
builder.install_zinit()!
// builder.set_entrypoint('redis-server')!
builder.commit('localhost/${name}')!
return builder
}
// TODO: all below are not good, need to use play cmd over hero remotely. see how we did it with core_installers
// // builder machine based on arch and install vlang
// pub fn (mut e CEngine) builder_go_rust(args GetArgs) !BuildAHContainer {
// console.print_header('buildah builder go rust')
// name := 'builder_go_rust'
// e.builder_base(reset: false)!
// if !args.reset && e.builder_exists(name)! {
// return e.builder_get(name)!
// }
// mut builder := e.builder_new(name: name, from: 'localhost/base', delete: true)!
// builder.hero_execute_cmd('installers -n golang,rust')!
// // builder.clean()!
// builder.commit('localhost/${name}')!
// e.load()!
// return builder
// }
// pub fn (mut e CEngine) builder_js(args GetArgs) !BuildAHContainer {
// console.print_header('buildah builder js')
// name := 'builder_js'
// e.builder_base(reset: false)!
// if !args.reset && e.builder_exists(name)! {
// return e.builder_get(name)!
// }
// mut builder := e.builder_new(name: name, from: 'localhost/base', delete: true)!
// builder.hero_execute_cmd('installers -n nodejs')!
// // builder.clean()!
// builder.commit('localhost/${name}')!
// e.load()!
// return builder
// }
// pub fn (mut e CEngine) builder_js_python(args GetArgs) !BuildAHContainer {
// console.print_header('buildah builder js python')
// name := 'builder_js_python'
// e.builder_js(reset: false)!
// if !args.reset && e.builder_exists(name)! {
// return e.builder_get(name)!
// }
// mut builder := e.builder_new(name: name, from: 'localhost/builder_js', delete: true)!
// builder.hero_execute_cmd('installers -n python')!
// // builder.clean()!
// builder.commit('localhost/${name}')!
// e.load()!
// return builder
// }
// pub fn (mut e CEngine) builder_hero(args GetArgs) !BuildAHContainer {
// console.print_header('buildah builder hero dev')
// name := 'builder_hero'
// e.builder_js_python(reset: false)!
// if !args.reset && e.builder_exists(name)! {
// return e.builder_get(name)!
// }
// mut builder := e.builder_new(name: name, from: 'localhost/builder_js_python', delete: true)!
// builder.hero_execute_cmd('installers -n hero')!
// // builder.clean()!
// builder.commit('localhost/${name}')!
// e.load()!
// return builder
// }
// pub fn (mut e CEngine) builder_herodev(args GetArgs) !BuildAHContainer {
// console.print_header('buildah builder hero dev')
// name := 'builder_herodev'
// e.builder_js_python(reset: false)!
// if !args.reset && e.builder_exists(name)! {
// return e.builder_get(name)!
// }
// mut builder := e.builder_new(name: name, from: 'localhost/builder_hero', delete: true)!
// builder.hero_execute_cmd('installers -n herodev')!
// // builder.clean()!
// builder.commit('localhost/${name}')!
// e.load()!
// return builder
// }
// pub fn (mut e CEngine) builder_heroweb(args GetArgs) !BuildAHContainer {
// console.print_header('buildah builder hero web')
// name := 'builder_heroweb'
// e.builder_go_rust(reset: false)!
// e.builder_hero(reset: false)!
// if !args.reset && e.builder_exists(name)! {
// return e.builder_get(name)!
// }
// mut builder0 := e.builder_new(
// name: 'builder_heroweb_temp'
// from: 'localhost/builder_go_rust'
// delete: true
// )!
// builder0.hero_execute_cmd('installers -n heroweb')!
// // builder0.hero_execute_cmd("installers -n heroweb")!
// mpath := builder0.mount_to_path()!
// // copy the built binary to host
// self.exec(
// cmd: '
// mkdir -p ${os.home_dir()}/hero/var/bin
// cp ${mpath}/usr/local/bin/* ${os.home_dir()}/hero/var/bin/
// '
// )!
// builder0.delete()!
// mut builder2 := e.builder_new(name: name, from: 'localhost/builder_hero', delete: true)!
// builder2.copy('${os.home_dir()}/hero/var/bin/', '/usr/local/bin/')!
// builder2.commit('localhost/${name}')!
// e.load()!
// return builder2
// }

View File

@@ -1,114 +0,0 @@
# Herocontainers
Tools to work with containers
```go
#!/usr/bin/env -S v -n -cg -w -enable-globals run
import incubaid.herolib.virt.herocontainers
import incubaid.herolib.ui.console
import incubaid.herolib.builder
//interactive means it will ask for login/passwd
console.print_header("BUILDAH Demo.")
//if herocompile is on, hero is force-compiled, which might be needed in debug mode
//so hero can execute heroscripts inside the build container
mut factory:=herocontainers.new(herocompile: true)!
//mut b:=factory.builder_new(name:"test")!
//create
factory.builderv_create()!
//get the container
//mut b2:=factory.builder_get("builderv")!
//b2.shell()!
```
## buildah tricks
```bash
#find the containers that have been built; these are the active ones you can work with
buildah ls
#see the images
buildah images
```
result is something like
```bash
CONTAINER ID BUILDER IMAGE ID IMAGE NAME CONTAINER NAME
a9946633d4e7 * scratch base
86ff0deb00bf * 4feda76296d6 localhost/builder:latest base_go_rust
```
some tricks
```bash
#run interactive in one (here we chose the builderv one)
buildah run --terminal --env TERM=xterm base /bin/bash
#or
buildah run --terminal --env TERM=xterm default /bin/bash
#or
buildah run --terminal --env TERM=xterm base_go_rust /bin/bash
```
to check disk usage inside the container
```bash
apt install ncdu
ncdu
```
## create container
```go
import incubaid.herolib.virt.herocontainers
import incubaid.herolib.ui.console
import incubaid.herolib.builder
//interactive means it will ask for login/passwd
console.print_header("Get a container.")
mut e:=herocontainers.new()!
//info see https://docs.podman.io/en/latest/markdown/podman-run.1.html
mut c:=e.container_create(
name: 'mycontainer'
image_repo: 'ubuntu'
// Resource limits
memory: '1g'
cpus: 0.5
// Network config
network: 'bridge'
network_aliases: ['myapp', 'api']
// DNS config
dns_servers: ['8.8.8.8', '8.8.4.4']
dns_search: ['example.com']
interactive: true // Keep STDIN open
mounts: [
'type=bind,src=/data,dst=/container/data,ro=true'
]
volumes: [
'/config:/etc/myapp:ro'
]
published_ports: [
'127.0.0.1:8080:80'
]
)!
```
## future
should make this module compatible with <https://github.com/containerd/nerdctl>

View File

@@ -1,13 +0,0 @@
!!hero_code.generate_installer
name:'daguserver'
classname:'DaguInstaller'
singleton:1
templates:1
default:1
title:''
supported_platforms:''
reset:0
startupmanager:1
hasconfig:1
build:0

View File

@@ -1,157 +0,0 @@
module daguserver
import incubaid.herolib.osal.core as osal
import incubaid.herolib.ui.console
import incubaid.herolib.core.texttools
import incubaid.herolib.core
import incubaid.herolib.core.httpconnection
import incubaid.herolib.installers.ulist
// import incubaid.herolib.develop.gittools
import incubaid.herolib.osal.startupmanager
import incubaid.herolib.libarchive.zinit as zinit_lib
import os
fn startupcmd() ![]startupmanager.ZProcessNewArgs {
mut res := []startupmanager.ZProcessNewArgs{}
mut cfg := get()!
res << startupmanager.ZProcessNewArgs{
name: 'dagu'
cmd: 'dagu server'
env: {
'HOME':      os.home_dir()
'DAGU_HOME': cfg.configpath // config for dagu is called admin.yml and is in this dir
}
}
res << startupmanager.ZProcessNewArgs{
name: 'dagu_scheduler'
cmd: 'dagu scheduler'
env: {
'HOME':      os.home_dir()
'DAGU_HOME': cfg.configpath
}
}
return res
}
fn running() !bool {
mut cfg := get()!
url := 'http://${cfg.host}:${cfg.port}/api/v1'
mut conn := httpconnection.new(name: 'dagu', url: url)!
if cfg.secret.len > 0 {
conn.default_header.add(.authorization, 'Bearer ${cfg.secret}')
}
console.print_debug("curl -X 'GET' '${url}'/tags --oauth2-bearer ${cfg.secret}")
r := conn.get_json_dict(prefix: 'tags', debug: false) or { return false }
tags := r['Tags'] or { return false }
console.print_debug(tags)
console.print_debug('Dagu is answering.')
return true
}
fn start_pre() ! {
}
fn start_post() ! {
}
fn stop_pre() ! {
}
fn stop_post() ! {
}
//////////////////// following actions are not specific to instance of the object
// checks if a certain version or above is installed
fn installed() !bool {
res := os.execute('dagu version')
if res.exit_code == 0 {
r := res.output.split_into_lines().filter(it.trim_space().len > 0)
if r.len != 1 {
return error("couldn't parse dagu version.\n${res.output}")
}
if texttools.version(version) > texttools.version(r[0]) {
return false
}
} else {
return false
}
return true
}
// get the Upload List of the files
fn ulist_get() !ulist.UList {
// optionally build a UList which is all paths which are result of building, is then used e.g. in upload
return ulist.UList{}
}
// uploads to S3 server if configured
fn upload() ! {}
fn install() ! {
console.print_header('install daguserver')
mut url := ''
if core.is_linux_arm()! {
url = 'https://github.com/dagu-dev/dagu/releases/download/v${version}/dagu_${version}_linux_arm64.tar.gz'
} else if core.is_linux_intel()! {
url = 'https://github.com/dagu-dev/dagu/releases/download/v${version}/dagu_${version}_linux_amd64.tar.gz'
} else if core.is_osx_arm()! {
url = 'https://github.com/dagu-dev/dagu/releases/download/v${version}/dagu_${version}_darwin_arm64.tar.gz'
} else if core.is_osx_intel()! {
url = 'https://github.com/dagu-dev/dagu/releases/download/v${version}/dagu_${version}_darwin_amd64.tar.gz'
} else {
return error('unsupported platform')
}
mut dest := osal.download(
url: url
minsize_kb: 9000
expand_dir: '/tmp/dagu'
)!
mut binpath := dest.file_get('dagu')!
osal.cmd_add(
cmdname: 'dagu'
source: binpath.path
)!
}
fn destroy() ! {
cmd := '
systemctl disable daguserver_scheduler.service
systemctl disable daguserver.service
systemctl stop daguserver_scheduler.service
systemctl stop daguserver.service
systemctl list-unit-files | grep daguserver
pkill -9 -f daguserver
ps aux | grep daguserver
'
osal.execute_silent(cmd) or {}
// mut zinit_factory := zinit_lib.Zinit{}
// if zinit_factory.exists('dagu') {
// zinit_factory.stop('dagu')! or { return error('Could not stop dagu service due to: ${err}') }
// zinit_factory.delete('dagu')! or {
// return error('Could not delete dagu service due to: ${err}')
// }
// }
// if zinit_factory.exists('dagu_scheduler') {
// zinit_factory.stop('dagu_scheduler')! or {
// return error('Could not stop dagu_scheduler service due to: ${err}')
// }
// zinit_factory.delete('dagu_scheduler')! or {
// return error('Could not delete dagu_scheduler service due to: ${err}')
// }
// }
}

View File

@@ -1,299 +0,0 @@
module daguserver
import incubaid.herolib.core.base
import incubaid.herolib.core.playbook { PlayBook }
import incubaid.herolib.ui.console
import json
import incubaid.herolib.osal.startupmanager
import time
__global (
daguserver_global map[string]&DaguInstaller
daguserver_default string
)
/////////FACTORY
@[params]
pub struct ArgsGet {
pub mut:
name string = 'default'
fromdb bool // will load from filesystem
create bool // default will not create if not exist
}
pub fn new(args ArgsGet) !&DaguInstaller {
mut obj := DaguInstaller{
name: args.name
}
set(obj)!
return get(name: args.name)!
}
pub fn get(args ArgsGet) !&DaguInstaller {
mut context := base.context()!
daguserver_default = args.name
if args.fromdb || args.name !in daguserver_global {
mut r := context.redis()!
if r.hexists('context:daguserver', args.name)! {
data := r.hget('context:daguserver', args.name)!
if data.len == 0 {
return error('DaguInstaller with name: ${args.name} does not exist, prob bug.')
}
mut obj := json.decode(DaguInstaller, data)!
set_in_mem(obj)!
} else {
if args.create {
new(args)!
} else {
return error("DaguInstaller with name '${args.name}' does not exist")
}
}
return get(name: args.name)! // no longer from db nor create
}
return daguserver_global[args.name] or {
return error('could not get config for daguserver with name: ${args.name}')
}
}
// register the config for the future
pub fn set(o DaguInstaller) ! {
mut o2 := set_in_mem(o)!
daguserver_default = o2.name
mut context := base.context()!
mut r := context.redis()!
r.hset('context:daguserver', o2.name, json.encode(o2))!
}
// does the config exists?
pub fn exists(args ArgsGet) !bool {
mut context := base.context()!
mut r := context.redis()!
return r.hexists('context:daguserver', args.name)!
}
pub fn delete(args ArgsGet) ! {
mut context := base.context()!
mut r := context.redis()!
r.hdel('context:daguserver', args.name)!
}
@[params]
pub struct ArgsList {
pub mut:
fromdb bool // will load from filesystem
}
// if fromdb set: load from filesystem, and not from mem, will also reset what is in mem
pub fn list(args ArgsList) ![]&DaguInstaller {
mut res := []&DaguInstaller{}
mut context := base.context()!
if args.fromdb {
// reset what is in mem
daguserver_global = map[string]&DaguInstaller{}
daguserver_default = ''
}
if args.fromdb {
mut r := context.redis()!
mut l := r.hkeys('context:daguserver')!
for name in l {
res << get(name: name, fromdb: true)!
}
return res
} else {
// load from memory
for _, client in daguserver_global {
res << client
}
}
return res
}
// only sets in mem, does not set as config
fn set_in_mem(o DaguInstaller) !DaguInstaller {
mut o2 := obj_init(o)!
daguserver_global[o2.name] = &o2
daguserver_default = o2.name
return o2
}
pub fn play(mut plbook PlayBook) ! {
if !plbook.exists(filter: 'daguserver.') {
return
}
mut install_actions := plbook.find(filter: 'daguserver.configure')!
if install_actions.len > 0 {
for install_action in install_actions {
heroscript := install_action.heroscript()
mut obj2 := heroscript_loads(heroscript)!
set(obj2)!
}
}
mut other_actions := plbook.find(filter: 'daguserver.')!
for other_action in other_actions {
if other_action.name in ['destroy', 'install', 'build'] {
mut p := other_action.params
reset := p.get_default_false('reset')
if other_action.name == 'destroy' || reset {
console.print_debug('install action daguserver.destroy')
destroy()!
}
if other_action.name == 'install' {
console.print_debug('install action daguserver.install')
install()!
}
}
if other_action.name in ['start', 'stop', 'restart'] {
mut p := other_action.params
name := p.get('name')!
mut daguserver_obj := get(name: name)!
console.print_debug('action object:\n${daguserver_obj}')
if other_action.name == 'start' {
console.print_debug('install action daguserver.${other_action.name}')
daguserver_obj.start()!
}
if other_action.name == 'stop' {
console.print_debug('install action daguserver.${other_action.name}')
daguserver_obj.stop()!
}
if other_action.name == 'restart' {
console.print_debug('install action daguserver.${other_action.name}')
daguserver_obj.restart()!
}
}
}
}
////////////////////////////////////////////////////////////////////////////////////////////////////
//////////////////////////# LIFE CYCLE MANAGEMENT FOR INSTALLERS ///////////////////////////////////
////////////////////////////////////////////////////////////////////////////////////////////////////
fn startupmanager_get(cat startupmanager.StartupManagerType) !startupmanager.StartupManager {
// unknown
// screen
// zinit
// tmux
// systemd
match cat {
.screen {
console.print_debug('startupmanager: screen')
return startupmanager.get(.screen)!
}
.zinit {
console.print_debug('startupmanager: zinit')
return startupmanager.get(.zinit)!
}
.systemd {
console.print_debug('startupmanager: systemd')
return startupmanager.get(.systemd)!
}
else {
console.print_debug('startupmanager: auto')
return startupmanager.get(.auto)!
}
}
}
// load from disk and make sure it is properly initialized
pub fn (mut self DaguInstaller) reload() ! {
self = obj_init(self)!
}
pub fn (mut self DaguInstaller) start() ! {
if self.running()! {
return
}
console.print_header('daguserver start')
if !installed()! {
install()!
}
configure()!
start_pre()!
for zprocess in startupcmd()! {
mut sm := startupmanager_get(zprocess.startuptype)!
console.print_debug('starting daguserver with ${zprocess.startuptype}...')
sm.new(zprocess)!
sm.start(zprocess.name)!
}
start_post()!
for _ in 0 .. 50 {
if self.running()! {
return
}
time.sleep(100 * time.millisecond)
}
return error('daguserver did not start properly.')
}
pub fn (mut self DaguInstaller) install_start(args InstallArgs) ! {
switch(self.name)
self.install(args)!
self.start()!
}
pub fn (mut self DaguInstaller) stop() ! {
switch(self.name)
stop_pre()!
for zprocess in startupcmd()! {
mut sm := startupmanager_get(zprocess.startuptype)!
sm.stop(zprocess.name)!
}
stop_post()!
}
pub fn (mut self DaguInstaller) restart() ! {
switch(self.name)
self.stop()!
self.start()!
}
pub fn (mut self DaguInstaller) running() !bool {
switch(self.name)
// walk over the generic processes, if not running return
for zprocess in startupcmd()! {
if zprocess.startuptype != .screen {
mut sm := startupmanager_get(zprocess.startuptype)!
r := sm.running(zprocess.name)!
if r == false {
return false
}
}
}
return running()!
}
@[params]
pub struct InstallArgs {
pub mut:
reset bool
}
pub fn (mut self DaguInstaller) install(args InstallArgs) ! {
switch(self.name)
if args.reset || (!installed()!) {
install()!
}
}
pub fn (mut self DaguInstaller) destroy() ! {
switch(self.name)
self.stop() or {}
destroy()!
}
// switch instance to be used for daguserver
pub fn switch(name string) {
}


@@ -1,79 +0,0 @@
module daguserver
import incubaid.herolib.data.encoderhero
import incubaid.herolib.crypt.secrets
import incubaid.herolib.ui.console
import incubaid.herolib.core.pathlib
import os
pub const version = '1.14.3'
const singleton = true
const default = true
pub const homedir = os.home_dir()
// THIS IS THE SOURCE OF THE INFORMATION OF THIS FILE; HERE WE HAVE THE CONFIG OBJECT CONFIGURED AND MODELLED
@[heap]
pub struct DaguInstaller {
pub mut:
name string = 'default'
dagsdir string = '${os.home_dir()}/.dagu'
configpath string = '${os.home_dir()}/.config/dagu'
username string
password string @[secret]
secret string @[secret]
title string
host string = 'localhost'
port int = 8014
}
// your checking & initialization code if needed
fn obj_init(mycfg_ DaguInstaller) !DaguInstaller {
mut mycfg := mycfg_
return mycfg
}
// called before start if done
fn configure() ! {
mut cfg := get()!
if cfg.password == '' {
cfg.password = secrets.hex_secret()!
}
// TODO: use DAGU_SECRET from env variables in the OS; if not set, default to an empty string
if cfg.secret == '' {
cfg.secret = secrets.openssl_hex_secret(input: cfg.password)!
}
if cfg.dagsdir == '' {
cfg.dagsdir = '${homedir}/.dagu'
}
if cfg.configpath == '' {
cfg.configpath = '${homedir}/.config/dagu'
}
if cfg.host == '' {
cfg.host = 'localhost'
}
if cfg.port == 0 {
cfg.port = 8014
}
mut mycode := $tmpl('templates/dagu.yaml')
mut path := pathlib.get_file(path: '${cfg.configpath}/admin.yaml', create: true)!
path.write(mycode)!
console.print_debug(mycode)
}
/////////////NORMALLY NO NEED TO TOUCH
pub fn heroscript_dumps(obj DaguInstaller) !string {
return encoderhero.encode[DaguInstaller](obj)!
}
pub fn heroscript_loads(heroscript string) !DaguInstaller {
mut obj := encoderhero.decode[DaguInstaller](heroscript)!
return obj
}


@@ -1,31 +0,0 @@
# daguserver
To get started
```v
import incubaid.herolib.installers.something.daguserver
mut installer:= daguserver.get()!
installer.start()!
```
## example heroscript
```hero
!!daguserver.install
homedir: '/home/user/daguserver'
username: 'admin'
password: 'secretpassword'
title: 'Some Title'
host: 'localhost'
port: 8888
```


@@ -1,29 +0,0 @@
# directory path to save logs from standard output
logDir: @{config.log_dir}
# history retention days (default: 30)
histRetentionDays: @{config.history_retention_days}
# Email notification settings
mailOn:
failure: @{config.mail_on.failure}
success: @{config.mail_on.success}
# SMTP server settings
smtp:
host: @{config.smtp.host}
port: @{config.smtp.port}
username: @{config.smtp.username}
password: @{config.smtp.password}
# Error mail configuration
errorMail:
from: @{config.error_mail.from}
to: @{config.error_mail.to}
prefix: @{config.error_mail.prefix}
# Info mail configuration
infoMail:
from: @{config.info_mail.from}
to: @{config.info_mail.to}
prefix: @{config.info_mail.prefix}


@@ -1,27 +0,0 @@
host: "${cfg.host}" # default: 127.0.0.1
port: ${cfg.port}
# path to the DAGs directory
dags: ${cfg.dagsdir}
# Web UI Color & Title
# navbarColor: <ui header color> # header color for web UI (e.g. "#ff0000")
navbarTitle: ${cfg.title} # header title for web UI (e.g. "PROD")
isBasicAuth: true
basicAuthUsername: ${cfg.username}
basicAuthPassword: ${cfg.password}
isAuthToken: true # enables API token
authToken: ${cfg.secret}
# Base Config
# baseConfig:
# Working Directory
# workDir: # default: DAG location
# SSL Configuration
# tls:
# certFile: <path to SSL certificate file>
# keyFile: <path to SSL key file>


@@ -1,107 +0,0 @@
# DedupeStore
DedupeStore is a content-addressable key-value store with built-in deduplication. It uses blake2b-160 content hashing to identify and deduplicate data, making it ideal for storing files or data blocks where the same content might appear multiple times.
## Features
- Content-based deduplication using blake2b-160 hashing
- Efficient storage using RadixTree for hash lookups
- Persistent storage using OurDB
- Maximum value size limit of 1MB
- Fast retrieval of data using content hash
- Automatic deduplication of identical content
## Usage
```v
import incubaid.herolib.data.dedupestor
// Create a new dedupestore
mut ds := dedupestor.new(
path: 'path/to/store'
reset: false // Set to true to reset existing data
)!
// Store some data
data := 'Hello, World!'.bytes()
hash := ds.store(data)!
println('Stored data with hash: ${hash}')
// Retrieve data using hash
retrieved := ds.get(hash)!
println('Retrieved data: ${retrieved.bytestr()}')
// Check if data exists
exists := ds.exists(hash)
println('Data exists: ${exists}')
// Attempting to store the same data again returns the same hash
same_hash := ds.store(data)!
assert hash == same_hash // True, data was deduplicated
```
## Implementation Details
DedupeStore uses two main components for storage:
1. **RadixTree**: Stores mappings from content hashes to data location IDs
2. **OurDB**: Stores the actual data blocks
When storing data:
1. The data is hashed using blake2b-160
2. If the hash exists in the RadixTree, the existing data location is returned
3. If the hash is new:
- Data is stored in OurDB, getting a new location ID
- Hash -> ID mapping is stored in RadixTree
- The hash is returned
When retrieving data:
1. The RadixTree is queried with the hash to get the data location ID
2. The data is retrieved from OurDB using the ID
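
The store flow described above can be sketched in V. Note this is an illustrative sketch only: the method and field names (`radix`, `db`, `blake2b_160`, `store_sketch`) are assumptions standing in for the module's actual internals.

```v
// Hypothetical sketch of the store flow; `radix`, `db`, and `blake2b_160`
// stand in for the real internals described above.
fn (mut ds DedupeStore) store_sketch(data []u8) ![]u8 {
	if data.len > 1024 * 1024 {
		return error('value too large: max 1MB')
	}
	hash := blake2b_160(data)
	if ds.radix.exists(hash) {
		// dedup hit: identical content is already stored once
		return hash
	}
	id := ds.db.set(data)! // store the actual bytes in OurDB
	ds.radix.set(hash, id)! // map content hash -> data location id
	return hash
}
```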
## Size Limits
- Maximum value size: 1MB
- Attempting to store larger values will result in an error
## The reference field
In the dedupestor system, the Reference struct is defined with two fields:
```v
pub struct Reference {
pub:
owner u16
id u32
}
```
The purpose of the `id` field in this context is to serve as an identifier within a specific owner's domain. Here's what each field represents:

- `owner` (u16): Identifies which entity or system component "owns" or is referencing the data. This could represent different applications, users, or subsystems that are using the dedupestor.
- `id` (u32): A unique identifier within that owner's domain. This allows each owner to have its own independent numbering system for referencing stored data.

Together, the `{owner: 1, id: 100}` combination creates a unique reference that:

- Tracks which entities are referencing a particular piece of data
- Allows the system to know when data can be safely deleted (when no references remain)
- Provides a way for different components to maintain their own ID systems without conflicts
The dedupestor uses these references to implement a reference counting mechanism. When data is stored, a reference is attached to it. When all references to a piece of data are removed (via the delete method), the actual data can be safely deleted from storage.
This design allows for efficient deduplication - if the same data is stored multiple times with different references, it's only physically stored once, but the system keeps track of all the references to it.
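
A minimal sketch of that reference-counting delete, again with hypothetical helper and field names (`metadata`, `refs`, `location`, `save_metadata` are illustrative assumptions, not the module's real API):

```v
// Hypothetical sketch: remove one reference; free the data only when
// no references remain.
fn (mut ds DedupeStore) delete_sketch(hash []u8, ref Reference) ! {
	mut meta := ds.metadata(hash)!
	meta.refs = meta.refs.filter(it.owner != ref.owner || it.id != ref.id)
	if meta.refs.len == 0 {
		ds.db.delete(meta.location)! // last reference gone: drop the bytes
		ds.radix.delete(hash)!
	} else {
		ds.save_metadata(hash, meta)! // other owners still reference the data
	}
}
```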
## Testing
The module includes comprehensive tests covering:
- Basic store/retrieve operations
- Deduplication functionality
- Size limit enforcement
- Edge cases
Run tests with:
```bash
v test lib/data/dedupestor/
```