Merge branch 'development' of github.com:incubaid/herolib into development

.github/workflows/README.md (vendored, new file, 32 lines)
@@ -0,0 +1,32 @@

# Building Hero for release

Generally speaking, our scripts and docs for building Hero produce non-portable binaries for Linux. While that's fine for development purposes, statically linked binaries are much more convenient for releases and distribution.

The release workflow here creates a static binary for Linux using an Alpine container. A few notes follow about how that's done.
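
You can quickly check whether a given binary really is static. This check is illustrative and not part of the workflow itself; the `cli/hero` path assumes a build from the repo root:

```bash
# A static build reports "statically linked"; a default build does not.
file cli/hero

# On glibc systems, ldd prints "not a dynamic executable" for static binaries.
ldd cli/hero
```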

## Static builds in vlang

Since V compiles to C in our case, we are really concerned with how to produce static C builds. The V project provides [some guidance](https://github.com/vlang/v?tab=readme-ov-file#docker-with-alpinemusl) on using an Alpine container and passing `-cflags -static` to the V compiler.
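
As a minimal illustration of that guidance (a sketch only; it assumes an Alpine container with V and gcc already installed, and a throwaway `hello.v`):

```bash
# Inside an Alpine (musl) container with V installed:
echo 'fn main() { println("hello") }' > hello.v
v -cc gcc -cflags -static hello.v
./hello && file hello   # file should report a statically linked executable
```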

That's fine for some projects. Hero has a dependency on the `libpq` C library for Postgres functionality, however, and this creates a complication.

## Static linking libpq

In order to create a static build of Hero on Alpine, we need to install some additional packages:

* openssl-libs-static
* postgresql-dev

The full `apk` command to prepare the container for building looks like this:

```bash
apk add --no-cache bash git build-base openssl-dev libpq-dev postgresql-dev openssl-libs-static
```

Then we also need to instruct the C compiler to link the static `_shlib` variants of Postgres's common and portability libraries. Here's the build command:

```bash
v -w -d use_openssl -enable-globals -cc gcc -cflags -static -ldflags "-lpgcommon_shlib -lpgport_shlib" cli/hero.v
```

Note that gcc is also the preferred compiler for static builds.
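
Putting the pieces together, a full container build might look like the following sketch (paths mirror the CI workflow below; it assumes the checkout contains the V source tree under `v/`):

```bash
docker run --rm \
  -v "$PWD/lib":/root/.vmodules/incubaid/herolib \
  -v "$PWD":/herolib \
  -w /herolib \
  alpine:3.22 \
  sh -c '
    apk add --no-cache bash git build-base openssl-dev libpq-dev postgresql-dev openssl-libs-static
    # Build and symlink the V toolchain shipped in the repo, then build hero statically
    cd v && make && ./v symlink && cd ..
    v -w -d use_openssl -enable-globals -cc gcc -cflags -static \
      -ldflags "-lpgcommon_shlib -lpgport_shlib" cli/hero.v
  '
```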

.github/workflows/hero_build.yml (vendored, 35 changes)
@@ -35,9 +35,6 @@ jobs:
      - name: Checkout code
        uses: actions/checkout@v4

-      # We do the workaround as described here https://github.com/Incubaid/herolib?tab=readme-ov-file#tcc-compiler-error-on-macos
-      # gcc and clang also don't work on macOS due to https://github.com/vlang/v/issues/25467
-      # We can change the compiler or remove this when one is fixed
      - name: Setup V & Herolib
        id: setup
        shell: bash
@@ -53,52 +50,34 @@ jobs:
            echo "Herolib symlink created to $(pwd)/lib"
        timeout-minutes: 10

-      # We can't make static builds for Linux easily, since we link to libql
-      # (Postgres) and this has no static version available in the Alpine
-      # repos. Therefore we build dynamic binaries for both glibc and musl.
-      #
-      # Again we work around a bug limiting our choice of C compiler tcc won't
-      # work on Alpine due to https://github.com/vlang/v/issues/24866
-      # So always use gcc for Linux
-      #
-      # For macOS, we can only use tcc (see above), but then we hit issues using
-      # the garbage collector, so disable that
+      # For Linux, we build a static binary linked against musl on Alpine. For
+      # static linking, gcc is preferred
      - name: Build Hero
        timeout-minutes: 15
        run: |
-          set -e
+          set -ex
          if [ "${{ runner.os }}" = "Linux" ]; then
-            sudo apt-get install libpq-dev
-            # Build for glibc
-            v -w -d use_openssl -enable-globals -cc gcc cli/hero.v -o cli/hero-${{ matrix.target }}
-
            # Build for musl using Alpine in Docker
            docker run --rm \
              -v ${{ github.workspace }}/lib:/root/.vmodules/incubaid/herolib \
              -v ${{ github.workspace }}:/herolib \
              -w /herolib \
-              alpine \
+              alpine:3.22 \
              sh -c '
-                apk add --no-cache bash git build-base openssl-dev libpq-dev
+                set -ex
+                apk add --no-cache bash git build-base openssl-dev libpq-dev postgresql-dev openssl-libs-static
                cd v
                make clean
                make
                ./v symlink
                cd ..
-                v -w -d use_openssl -enable-globals -cc gcc cli/hero.v -o cli/hero-${{ matrix.target }}-musl
+                v -w -d use_openssl -enable-globals -cc gcc -cflags -static -ldflags "-lpgcommon_shlib -lpgport_shlib" cli/hero.v -o cli/hero-${{ matrix.target }}-musl
              '

          else
            v -w -d use_openssl -enable-globals -cc clang cli/hero.v -o cli/hero-${{ matrix.target }}
          fi

-      - name: Upload glibc binary
-        if: runner.os == 'Linux'
-        uses: actions/upload-artifact@v4
-        with:
-          name: hero-${{ matrix.target }}
-          path: cli/hero-${{ matrix.target }}
-
      - name: Upload musl binary
        if: runner.os == 'Linux'
        uses: actions/upload-artifact@v4

aiprompts/README.md (new file, 77 lines)
@@ -0,0 +1,77 @@

# HeroLib AI Prompts (`aiprompts/`)

This directory contains AI-oriented instructions and manuals for working with the Hero tool and the `herolib` codebase.

It is the **entry point for AI agents** that generate or modify code/docs in this repository.

## Scope

- **Global rules for AI and V/Hero usage**
  See:
  - `herolib_start_here.md`
  - `vlang_herolib_core.md`
- **Herolib core modules**
  See:
  - `herolib_core/` (core HeroLib modules)
  - `herolib_advanced/` (advanced topics)
- **Docusaurus & Site module (Hero docs)**
  See:
  - `docusaurus/docusaurus_ebook_manual.md`
  - `lib/web/docusaurus/README.md` (authoritative module doc)
  - `lib/web/site/ai_instructions.md` and `lib/web/site/readme.md`
- **HeroModels / HeroDB**
  See:
  - `ai_instructions_hero_models.md`
  - `heromodel_instruct.md`
- **V language & web server docs** (upstream-style, mostly language-level)
  See:
  - `v_core/`, `v_advanced/`
  - `v_veb_webserver/`

## Sources of Truth

For any domain, **code and module-level docs are authoritative**:

- Core install & usage: `herolib/README.md`, scripts under `scripts/`
- Site module: `lib/web/site/ai_instructions.md`, `lib/web/site/readme.md`
- Docusaurus module: `lib/web/docusaurus/README.md`, `lib/web/docusaurus/*.v`
- Atlas client: `lib/data/atlas/client/README.md`
- HeroModels: `lib/hero/heromodels/*.v` + tests

`aiprompts/` files **must not contradict** these. When in doubt, follow the code / module docs first and treat prompts as guidance.

## Directory Overview

- `herolib_start_here.md` / `vlang_herolib_core.md`
  Global AI rules and V/Hero basics.
- `herolib_core/` & `herolib_advanced/`
  Per-module instructions for core/advanced HeroLib features.
- `docusaurus/`
  AI manual for building Hero docs/ebooks with the Docusaurus + Site + Atlas pipeline.
- `instructions/`
  Active, higher-level instructions (e.g. HeroDB base filesystem).
- `instructions_archive/`
  **Legacy / historical** prompt material. See `instructions_archive/README.md`.
- `todo/`
  Meta design/refactor notes (not up-to-date instructions for normal usage).
- `v_core/`, `v_advanced/`, `v_veb_webserver/`
  V language and web framework references used when generating V code.
- `bizmodel/`, `unpolly/`, `doctree/`, `documentor/`
  Domain-specific or feature-specific instructions.

## How to Treat Legacy Material

- Content under `instructions_archive/` is **kept for reference** and may describe older flows (e.g. older documentation or prompt pipelines).
  Do **not** use it as a primary source for new work unless explicitly requested.
- Some prompts mention **Doctree**; the current default docs pipeline uses **Atlas**. Doctree/`doctreeclient` is an alternative/legacy backend.

## Guidelines for AI Agents

- Always:
  - Respect global rules in `herolib_start_here.md` and `vlang_herolib_core.md`.
  - Prefer module docs under `lib/` when behavior or parameters differ.
  - Avoid modifying generated files (e.g. `*_.v` or other generated artifacts) as instructed.
- When instructions conflict, resolve as:
  1. **Code & module docs in `lib/`**
  2. **AI instructions in `aiprompts/`**
  3. **Archived docs (`instructions_archive/`) only when explicitly needed**.

@@ -2,9 +2,9 @@

## Overview

This document provides clear instructions for AI agents to create new HeroDB models similar to `message.v`.
These models are used to store structured data in Redis using the HeroDB system.
-The message.v can be found in `lib/hero/heromodels/message.v`.s
+The `message.v` example can be found in `lib/hero/heromodels/message.v`.

## Key Concepts

@@ -108,7 +108,7 @@ Add your model to the ModelsFactory struct in `factory.v`:
```v
pub struct ModelsFactory {
pub mut:
-    messages DBCalendar
+    calendar DBCalendar
    // ... other models
}
```

@@ -1,3 +1,5 @@
+> NOTE: Atlas is the default document collections/export pipeline used by the current Docusaurus integration (see `lib/data/atlas/client`). The Doctree module described here is an alternative/legacy export mechanism that still exists but is not the primary path. Use Atlas by default unless you explicitly need Doctree.
+
# Doctree Export Specification

## Overview

@@ -8,7 +8,7 @@ To effectively create ebooks with HeroLib, it's crucial to understand the interp

* **HeroScript**: A concise scripting language used to define the structure, configuration, and content flow of your Docusaurus site. It acts as the declarative interface for the entire process.
* **Docusaurus**: A popular open-source static site generator. HeroLib uses Docusaurus as the underlying framework to render your ebook content into a navigable website.
-* **Doctree**: HeroLib's content management system. Doctree organizes your markdown files into "collections" and "pages," allowing for structured content retrieval and reuse across multiple projects.
+* **Atlas (and Doctree)**: HeroLib's document collection layer. In the current pipeline, Atlas exports markdown "collections" and "pages" that Docusaurus consumes via the Atlas client. Doctree and `doctreeclient` are legacy/alternative ways to provide the same collections.

## 2. Setting Up a Docusaurus Project with HeroLib

@@ -22,18 +22,26 @@ The `docusaurus.define` HeroScript directive configures the global settings for

```heroscript
!!docusaurus.define
+    name:"my_ebook" // must match the site name from !!site.config
    path_build: "/tmp/my_ebook_build"
    path_publish: "/tmp/my_ebook_publish"
-    production: true
-    update: true
+    reset: true // clean build dir before building (optional)
+    install: true // run bun install if needed (optional)
+    template_update: true // update the Docusaurus template (optional)
+    atlas_dir: "/tmp/atlas_export" // where Atlas exports collections
+    use_atlas: true // use Atlas as content backend
```

**Arguments:**

+* `name` (string, required): The site/factory name. Must match the `name` used in `!!site.config` so Docusaurus can find the corresponding site definition.
* `path_build` (string, optional): The local path where the Docusaurus site will be built. Defaults to `~/hero/var/docusaurus/build`.
* `path_publish` (string, optional): The local path where the final Docusaurus site will be published (e.g., for deployment). Defaults to `~/hero/var/docusaurus/publish`.
-* `production` (boolean, optional): If `true`, the site will be built for production (optimized). Default is `false`.
-* `update` (boolean, optional): If `true`, the Docusaurus template and dependencies will be updated. Default is `false`.
+* `reset` (boolean, optional): If `true`, clean the build directory before starting.
+* `install` (boolean, optional): If `true`, run dependency installation (e.g., `bun install`).
+* `template_update` (boolean, optional): If `true`, update the Docusaurus template.
+* `atlas_dir` (string, optional): Directory where Atlas exports collections (used by the Atlas client in `lib/data/atlas/client`).
+* `use_atlas` (boolean, optional): If `true`, use the Atlas client as the content backend (default behavior).

### 2.2. Adding a Docusaurus Site (`docusaurus.add`)

@@ -53,7 +61,7 @@ The `docusaurus.add` directive defines an individual Docusaurus site (your ebook

```heroscript
!!docusaurus.add
    name:"tfgrid_tech_ebook"
-    git_url:"https://git.threefold.info/tfgrid/docs_tfgrid4/src/branch/main/ebooks/tech"
+    git_url:"https://git.ourworld.tf/tfgrid/docs_tfgrid4/src/branch/main/ebooks/tech"
    git_reset:true // Reset Git repository before pulling
    git_pull:true // Pull latest changes
    git_root:"/tmp/git_clones" // Optional: specify a root directory for git clones
```

@@ -190,18 +198,18 @@ Configure the footer section of your Docusaurus site.
* `href` (string, optional): External URL for the link.
* `to` (string, optional): Internal Docusaurus path.

-### 3.4. Build Destinations (`site.build_dest`, `site.build_dest_dev`)
+### 3.4. Publish Destinations (`site.publish`, `site.publish_dev`)

Specify where the built Docusaurus site should be deployed. This typically involves an SSH connection defined elsewhere (e.g., `!!site.ssh_connection`).

**HeroScript Example:**

```heroscript
-!!site.build_dest
+!!site.publish
    ssh_name:"production_server" // Name of a pre-defined SSH connection
    path:"/var/www/my-ebook" // Remote path on the server

-!!site.build_dest_dev
+!!site.publish_dev
    ssh_name:"dev_server"
    path:"/tmp/dev-ebook"
```

@@ -219,7 +227,7 @@ This powerful feature allows you to pull markdown content and assets from other

```heroscript
!!site.import
-    url:'https://git.threefold.info/tfgrid/docs_tfgrid4/src/branch/main/collections/cloud_reinvented'
+    url:'https://git.ourworld.tf/tfgrid/docs_tfgrid4/src/branch/main/collections/cloud_reinvented'
    dest:'cloud_reinvented' // Destination subdirectory within your Docusaurus docs folder
    replace:'NAME:MyName, URGENCY:red' // Optional: comma-separated key:value pairs for text replacement
```

@@ -265,22 +273,29 @@ This is where you define the actual content pages and how they are organized int
* `draft` (boolean, optional): If `true`, the page will be marked as a draft and not included in production builds.
* `title_nr` (int, optional): If set, HeroLib will re-number the markdown headings (e.g., `title_nr:3` will make `# Heading` become `### Heading`). Useful for consistent heading levels across imported content.

-### 3.7. Doctree Integration Details
+### 3.7. Collections and Atlas/Doctree Integration

-The `site.page` directive's `src` parameter (`collection_name:page_name`) is the bridge to your Doctree content.
+The `site.page` directive's `src` parameter (`collection_name:page_name`) is the bridge to your content collections.

-**How Doctree Works:**
+**Current default: Atlas export**

+1. **Collections**: Atlas exports markdown files into collections under an `export_dir` (see `lib/data/atlas/client`).
+2. **Export step**: A separate process (Atlas) writes the collections into `atlas_dir` (e.g., `/tmp/atlas_export`), following the `content/` + `meta/` structure.
+3. **Docusaurus consumption**: The Docusaurus module uses the Atlas client (`atlas_client`) to resolve `collection_name:page_name` into markdown content and assets when generating docs.
+
+**Alternative: Doctree/`doctreeclient`**
+
+In older setups, or when explicitly configured, Doctree and `doctreeclient` can still be used to provide the same `collection:page` model:
+
1. **Collections**: Doctree organizes markdown files into logical groups called "collections." A collection is typically a directory containing markdown files and an empty `.collection` file.
-2. **Scanning**: You define which collections Doctree should scan using `!!doctree.scan` in a HeroScript file (e.g., `doctree.heroscript`).
-    **Example `doctree.heroscript`:**
+2. **Scanning**: You define which collections Doctree should scan using `!!doctree.scan` in a HeroScript file (e.g., `doctree.heroscript`):

```heroscript
-!!doctree.scan git_url:"https://git.threefold.info/tfgrid/docs_tfgrid4/src/branch/main/collections"
+!!doctree.scan git_url:"https://git.ourworld.tf/tfgrid/docs_tfgrid4/src/branch/main/collections"
```

This will pull the `collections` directory from the specified Git URL and make its contents available to Doctree.
-3. **Page Retrieval**: When `site.page` references `src:"my_collection:my_page"`, HeroLib's `doctreeclient` fetches the content of `my_page.md` from the `my_collection` collection that Doctree has scanned.
+3. **Page Retrieval**: When `site.page` references `src:"my_collection:my_page"`, the client (`atlas_client` or `doctreeclient`, depending on configuration) fetches the content of `my_page.md` from the `my_collection` collection.

## 4. Building and Developing Your Ebook

@@ -35,11 +35,11 @@ pub fn play(mut plbook PlayBook) ! {
    if plbook.exists_once(filter: 'docusaurus.define') {
        mut action := plbook.get(filter: 'docusaurus.define')!
        mut p := action.params
-        //example how we get parameters from the action see core_params.md for more details
-        ds = new(
-            path: p.get_default('path_publish', '')!
-            production: p.get_default_false('production')
-        )!
+        //example how we get parameters from the action see aiprompts/herolib_core/core_params.md for more details
+        path_build := p.get_default('path_build', '')!
+        path_publish := p.get_default('path_publish', '')!
+        reset := p.get_default_false('reset')
+        use_atlas := p.get_default_false('use_atlas')
    }

    // Process 'docusaurus.add' actions to configure individual Docusaurus sites
@@ -51,4 +51,4 @@ pub fn play(mut plbook PlayBook) ! {
}
```

-For detailed information on parameter retrieval methods (e.g., `p.get()`, `p.get_int()`, `p.get_default_true()`), refer to `aiprompts/ai_core/core_params.md`.
+For detailed information on parameter retrieval methods (e.g., `p.get()`, `p.get_int()`, `p.get_default_true()`), refer to `aiprompts/herolib_core/core_params.md`.

@@ -1,3 +1,5 @@
+> NOTE: This document is an example snapshot of a developer's filesystem layout for HeroDB/HeroModels. Paths under `/Users/despiegk/...` are illustrative only. For the current, authoritative structure always use the live repository tree (this checkout) and the modules under `lib/hero/heromodels` and `lib/hero/db`.
+
<file_map>
/Users/despiegk/code/github/incubaid/herolib
├── .github

aiprompts/instructions_archive/README.md (new file, 15 lines)
@@ -0,0 +1,15 @@

# Instructions Archive (Legacy Prompts)

This directory contains **archived / legacy AI prompt material** for `herolib`.

- Files here may describe **older workflows** (e.g. previous documentation generation or model pipelines).
- They are kept for **historical reference** and to help understand how things evolved.
- They are **not** guaranteed to match the current `herolib` implementation.

## Usage Guidelines

- Do **not** use these files as the primary source for new features or refactors.
- When generating code or documentation, prefer:
  1. Code and module docs under `lib/` (e.g. `lib/web/site/ai_instructions.md`, `lib/web/docusaurus/README.md`).
  2. Up-to-date AI instructions under `aiprompts/` (outside of `instructions_archive/`).
- Only consult this directory when you explicitly need to understand **historical behavior** or migrate old flows.

@@ -122,12 +122,12 @@ pub fn play(mut plbook PlayBook) ! {
    if plbook.exists_once(filter: 'docusaurus.define') {
        mut action := plbook.get(filter: 'docusaurus.define')!
        mut p := action.params
-        //example how we get parameters from the action see core_params.md for more details
-        ds = new(
-            path: p.get_default('path_publish', '')!
-            production: p.get_default_false('production')
-        )!
+        //example how we get parameters from the action see aiprompts/herolib_core/core_params.md for more details
+        path_build := p.get_default('path_build', '')!
+        path_publish := p.get_default('path_publish', '')!
+        reset := p.get_default_false('reset')
+        use_atlas := p.get_default_false('use_atlas')
    }

    // Process 'docusaurus.add' actions to configure individual Docusaurus sites
    actions := plbook.find(filter: 'docusaurus.add')!
@@ -138,7 +138,7 @@ pub fn play(mut plbook PlayBook) ! {
}
```

-For detailed information on parameter retrieval methods (e.g., `p.get()`, `p.get_int()`, `p.get_default_true()`), refer to `aiprompts/ai_core/core_params.md`.
+For detailed information on parameter retrieval methods (e.g., `p.get()`, `p.get_int()`, `p.get_default_true()`), refer to `aiprompts/herolib_core/core_params.md`.

# PlayBook, process heroscripts

@@ -53,7 +53,7 @@ fn do() ! {
    mut cmd := Command{
        name: 'hero'
        description: 'Your HERO toolset.'
-        version: '1.0.36'
+        version: '1.0.38'
    }

    mut toinstall := false
@@ -103,4 +103,4 @@ fn main() {
        print_backtrace()
        exit(1)
    }
}

@@ -11,8 +11,13 @@ pub fn parse_const(code_ string) !Const {
    if !code.contains('=') {
        return error('code <${code_}> is not of const')
    }
+    mut name := code.split('=')[0].trim_space()
+    // Strip 'const ' prefix if present
+    if name.starts_with('const ') {
+        name = name.trim_string_left('const ').trim_space()
+    }
    return Const{
-        name: code.split('=')[0].trim_space()
+        name: name
        value: code.split('=')[1].trim_space()
    }
}

@@ -44,6 +44,11 @@ pub fn (p Param) typescript() string {
pub fn parse_param(code_ string) !Param {
    mut code := code_.trim_space()

+    // Handle empty string (void return type)
+    if code == '' {
+        return Param{}
+    }
+
    if code == '!' {
        return Param{
            is_result: true
@@ -60,6 +65,13 @@ pub fn parse_param(code_ string) !Param {
    }
    split := code.split(' ').filter(it != '')

+    // Handle empty split (void return type after mut check)
+    if split.len == 0 {
+        return Param{
+            mutable: is_mut
+        }
+    }
+
    if split.len == 1 {
        // means anonymous param
        return Param{

@@ -15,14 +15,14 @@ fn test_comprehensive_code_parsing() {
    console.print_lf(1)

    // Run all tests
-    test_module_parsing()
-    test_struct_parsing()
-    test_function_parsing()
-    test_imports_and_modules()
-    test_type_system()
-    test_visibility_modifiers()
-    test_method_parsing()
-    test_constants_parsing()
+    check_module_parsing()!
+    check_struct_parsing()
+    check_function_parsing()!
+    check_imports_and_modules()
+    check_type_system()
+    check_visibility_modifiers()
+    check_method_parsing()!
+    check_constants_parsing()

    console.print_green('✓ All comprehensive tests passed!')
    console.print_lf(1)
@@ -74,7 +74,7 @@ fn copy_directory(src string, dst string) ! {
    }
}

-fn test_module_parsing() {
+fn check_module_parsing() ! {
    console.print_header('Test 1: Module and File Parsing')

    mut myparser := new(path: '/tmp/codeparsertest', recursive: true)!
@@ -98,7 +98,7 @@ fn test_module_parsing() {
    console.print_lf(1)
}

-fn test_struct_parsing() {
+fn check_struct_parsing() {
    console.print_header('Test 2: Struct Parsing')

    models_file := os.join_path('/tmp/codeparsertest', 'models.v')
@@ -145,7 +145,7 @@ fn test_struct_parsing() {
    console.print_lf(1)
}

-fn test_function_parsing() {
+fn check_function_parsing() ! {
    console.print_header('Test 3: Function Parsing')

    mut myparser := new(path: '/tmp/codeparsertest', recursive: true)!
@@ -191,7 +191,7 @@ fn test_function_parsing() {
    console.print_lf(1)
}

-fn test_imports_and_modules() {
+fn check_imports_and_modules() {
    console.print_header('Test 4: Imports and Module Names')

    models_file := os.join_path('/tmp/codeparsertest', 'models.v')
@@ -222,7 +222,7 @@ fn test_imports_and_modules() {
    console.print_lf(1)
}

-fn test_type_system() {
+fn check_type_system() {
    console.print_header('Test 5: Type System')

    models_file := os.join_path('/tmp/codeparsertest', 'models.v')
@@ -257,7 +257,7 @@ fn test_type_system() {
    console.print_lf(1)
}

-fn test_visibility_modifiers() {
+fn check_visibility_modifiers() {
    console.print_header('Test 6: Visibility Modifiers')

    models_file := os.join_path('/tmp/codeparsertest', 'models.v')
@@ -293,7 +293,7 @@ fn test_visibility_modifiers() {
    console.print_lf(1)
}

-fn test_method_parsing() {
+fn check_method_parsing() ! {
    console.print_header('Test 7: Method Parsing')

    mut myparser := new(path: '/tmp/codeparsertest', recursive: true)!
@@ -327,7 +327,7 @@ fn test_method_parsing() {
    console.print_lf(1)
}

-fn test_constants_parsing() {
+fn check_constants_parsing() {
    console.print_header('Test 8: Constants Parsing')

    models_file := os.join_path('/tmp/codeparsertest', 'models.v')

@@ -149,19 +149,19 @@ pub fn (mut config CrunConfig) set_hostname(hostname string) &CrunConfig {
}

pub fn (mut config CrunConfig) set_memory_limit(limit_bytes u64) &CrunConfig {
-    config.spec.linux.resources.memory.limit = limit_bytes
+    config.spec.linux_config.resources.memory.limit = limit_bytes
    return config
}

pub fn (mut config CrunConfig) set_cpu_limits(period u64, quota i64, shares u64) &CrunConfig {
-    config.spec.linux.resources.cpu.period = period
-    config.spec.linux.resources.cpu.quota = quota
-    config.spec.linux.resources.cpu.shares = shares
+    config.spec.linux_config.resources.cpu.period = period
+    config.spec.linux_config.resources.cpu.quota = quota
+    config.spec.linux_config.resources.cpu.shares = shares
    return config
}

pub fn (mut config CrunConfig) set_pids_limit(limit i64) &CrunConfig {
-    config.spec.linux.resources.pids.limit = limit
+    config.spec.linux_config.resources.pids.limit = limit
    return config
}

@@ -222,15 +222,15 @@ pub fn (mut config CrunConfig) set_terminal(value bool) &CrunConfig {
}

pub fn (mut config CrunConfig) add_masked_path(path string) &CrunConfig {
-    if path !in config.spec.linux.masked_paths {
-        config.spec.linux.masked_paths << path
+    if path !in config.spec.linux_config.masked_paths {
+        config.spec.linux_config.masked_paths << path
    }
    return config
}

pub fn (mut config CrunConfig) add_readonly_path(path string) &CrunConfig {
-    if path !in config.spec.linux.readonly_paths {
-        config.spec.linux.readonly_paths << path
+    if path !in config.spec.linux_config.readonly_paths {
+        config.spec.linux_config.readonly_paths << path
    }
    return config
}
@@ -293,7 +293,7 @@ fn create_default_spec() Spec {
    }
    hostname: 'container'
    mounts: create_default_mounts()
-    linux: Linux{
+    linux_config: LinuxConfig{
        namespaces: create_default_namespaces()
        masked_paths: [
            '/proc/acpi',

@@ -3,14 +3,14 @@ module crun
// OCI Runtime Spec structures that can be directly encoded to JSON
pub struct Spec {
pub mut:
    oci_version  string @[json: 'ociVersion']
    platform     Platform
    process      Process
    root         Root
    hostname     string
    mounts       []Mount
-    linux        Linux
+    linux_config LinuxConfig
    hooks        Hooks
}

pub struct Platform {
@@ -68,7 +68,7 @@ pub mut:
    options []string
}

-pub struct Linux {
+pub struct LinuxConfig {
pub mut:
    namespaces []LinuxNamespace
    resources  LinuxResources

@@ -88,7 +88,7 @@ os.write_file(hero_v_path, lines.join_lines()) or {
os.rm('${hero_v_path}.backup') or { eprintln('Warning: Could not remove backup file: ${err}') }

// Update version in install_hero.sh
-install_hero_path := '${ourdir}/install_hero.sh'
+install_hero_path := '${ourdir}/scripts/install_hero.sh'
install_hero_content := os.read_file(install_hero_path) or {
    eprintln('Error reading ${install_hero_path}: ${err}')
    exit(1)

@@ -4,7 +4,7 @@ set -e

os_name="$(uname -s)"
arch_name="$(uname -m)"
-version='1.0.36'
+version='1.0.38'

# Detect Linux distribution type
linux_type=""
@@ -17,19 +17,11 @@ fi
# Base URL for GitHub releases
base_url="https://github.com/incubaid/herolib/releases/download/v${version}"

-# Select the URL based on the platform
+# Select the URL based on the platform. For Linux we have a single static binary
if [[ "$os_name" == "Linux" && "$arch_name" == "x86_64" ]]; then
-    if [[ "$linux_type" == "alpine" ]]; then
-        url="$base_url/hero-x86_64-linux-musl"
-    else
-        url="$base_url/hero-x86_64-linux"
-    fi
+    url="$base_url/hero-x86_64-linux-musl"
elif [[ "$os_name" == "Linux" && "$arch_name" == "aarch64" ]]; then
-    if [[ "$linux_type" == "alpine" ]]; then
-        url="$base_url/hero-aarch64-linux-musl"
-    else
-        url="$base_url/hero-aarch64-linux"
-    fi
+    url="$base_url/hero-aarch64-linux-musl"
elif [[ "$os_name" == "Darwin" && "$arch_name" == "arm64" ]]; then
    url="$base_url/hero-aarch64-apple-darwin"
# elif [[ "$os_name" == "Darwin" && "$arch_name" == "x86_64" ]]; then
@@ -146,4 +138,4 @@ if [ "$file_size" -ge 2 ]; then
else
    echo "Downloaded file is less than 2 MB. Process aborted."
    exit 1
fi