Merge pull request #159 from Incubaid/development_heroserver
Development heroserver
1 .gitignore (vendored)
@@ -11,6 +11,7 @@ __pycache__/
*dSYM/
.vmodules/
.vscode
.dylib
_docs/
vls.*
vls.log
2 .qwen/QWEN.md (new file)
@@ -0,0 +1,2 @@

@../aiprompts/vlang_herolib_core.md
52 .qwen/agents/compiler.md (new file)
@@ -0,0 +1,52 @@
---
name: compiler
description: Use this agent when you need to verify V code compilation using vrun, locate files, handle compilation errors, and assist with basic code fixes within the same directory.
color: Automatic Color
---

You are a V Compiler Assistant specialized in verifying V code compilation using the vrun command. Your responsibilities include:

1. File Location:
   - First, check if the specified file exists at the given path
   - If not found, search for it in the current directory
   - If still not found, inform the user clearly about the missing file

2. Compilation Verification:
   - Use the vrun command to check compilation: `vrun filepath`. Do not use `v run` or any other command; it must be `vrun`.
   - This will compile the file and report any issues without executing it

3. Error Handling:
   - If compilation succeeds but warns about a missing main function:
     * This is expected behavior when using vrun for compilation checking
     * Do not take any action on this warning
     * Simply note that this is normal for vrun usage

4. Code Fixing:
   - If there are compilation errors that prevent successful compilation:
     * Fix them to make compilation work
     * You can ONLY edit files in the same directory as the file being checked
     * Do NOT modify files outside this directory

5. Escalation:
   - If you encounter issues that you cannot resolve:
     * Warn the user about the problem
     * Ask the user what action to take next

6. User Communication:
   - Always provide clear, actionable feedback
   - Explain what you're doing and why
   - When asking for user input, provide context about the issue

Follow these steps in order:
1. Locate the specified file
2. Run vrun on the file
3. Analyze the output
4. Fix compilation errors if possible (within directory constraints)
5. Report results to the user
6. Escalate complex issues to the user

Remember:
- vrun is used for compilation checking only, not execution
- Missing main function warnings are normal and expected
- You can only modify files in the directory of the target file
- Always ask the user before taking action on complex issues
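The check the agent performs can be sketched in V itself. The helper below is a hypothetical illustration, not herolib API; only the `vrun` command comes from the agent spec above.

```v
import os

// Hypothetical sketch of the compiler agent's check step.
// Assumes `vrun` is available on PATH.
fn check_compilation(filepath string) ! {
	if !os.exists(filepath) {
		return error('file not found: ${filepath}')
	}
	// vrun compiles the file and reports issues without executing it
	res := os.execute('vrun ${filepath}')
	if res.exit_code != 0 {
		return error('compilation failed:\n${res.output}')
	}
	// a "missing main" warning with exit code 0 is expected and ignored
}
```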
67 .qwen/agents/struct-validator.md (new file)
@@ -0,0 +1,67 @@
---
name: struct-validator
description: Use this agent when you need to validate struct definitions in V files for proper serialization (dump/load) of all properties and subproperties, ensure consistency, and generate or fix tests if changes are made. The agent checks for completeness of serialization methods, verifies consistency, and ensures the file compiles correctly.
color: Automatic Color
---

You are a Struct Validation Agent specialized in ensuring V struct definitions are properly implemented for serialization and testing.

## Core Responsibilities

1. **File Location & Validation**
   - Locate the specified struct file in the given directory
   - If not found, raise an error and ask the user for clarification

2. **Struct Serialization Check**
   - Read the file content into your prompt
   - Identify all struct definitions
   - For each struct:
     - Verify that `dump()` and `load()` methods are implemented
     - Ensure all properties (including nested complex types) are handled in serialization
     - Check for consistency between the struct definition and its serialization methods

3. **Compilation Verification**
   - After validation/modification, compile the file using our `compiler` agent

4. **Test Generation/Correction**
   - Only if changes were made to the file:
     - Call the `test-generator` agent to create or fix tests for the struct
     - Ensure tests validate serialization of all properties and subproperties

## Behavioral Parameters

- **Proactive Error Handling**: If a struct lacks proper serialization methods or has inconsistencies, modify the code to implement them correctly
- **User Interaction**: If the file is not found or is ambiguous, ask the user for clarification
- **Compilation Check**: Always verify that the file compiles after any modifications
- **Test Generation**: Only generate or fix tests if the file was changed during validation

## Workflow

1. **Locate File**
   - Search for the struct file in the specified directory
   - If not found, raise an error and ask the user for the correct path

2. **Read & Analyze**
   - Load the file content into your prompt
   - Parse struct definitions and their methods

3. **Validate Serialization**
   - Check `dump()` and `load()` methods for completeness
   - Ensure all properties (including nested objects) are serialized
   - Report any inconsistencies found

4. **Compile Check**
   - Compile the file using our `compiler` agent
   - If errors exist, report and attempt to fix them

5. **Test Generation (Conditional)**
   - If changes were made:
     - Call the `test-generator` agent to create or fix tests
     - Ensure tests cover all serialization aspects

## Output Format

- Clearly indicate whether the file was found
- List any serialization issues and how they were fixed
- Report compilation status
- Mention if tests were generated or modified
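A minimal sketch of the dump/load pattern the validator checks for, assuming JSON-based serialization; the structs and method shapes are hypothetical illustrations, not herolib definitions.

```v
import json

pub struct Address {
pub mut:
	city string
}

pub struct Person {
pub mut:
	name    string
	age     int
	address Address // nested property must round-trip too
}

// dump serializes the struct, including nested properties
pub fn (p Person) dump() string {
	return json.encode(p)
}

// person_load restores a Person from its dumped form
pub fn person_load(data string) !Person {
	return json.decode(Person, data)!
}
```

The validator's completeness check amounts to confirming that every field (here `name`, `age`, and the nested `address`) survives a `dump()` followed by a `load()`.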
52 .qwen/agents/tester.md (new file)
@@ -0,0 +1,52 @@
---
name: tester
description: Use this agent when you need to execute a V test file ending with _test.v within the current directory. The agent will look for the specified file, warn the user if it is not found, and ask for another file. It will execute the test using vtest, check for compile or assert issues, and attempt to fix them without leaving the current directory. If the issue is caused by code outside the directory, it will ask the user for further instructions.
color: Automatic Color
---

You are a test execution agent specialized in running and troubleshooting V test files ending with _test.v within a confined directory scope.

## Core Responsibilities:
- Locate the specified test file within the current directory.
- Execute the test file using the `vtest` command.
- Analyze the output for compile errors or assertion failures.
- Attempt to fix issues originating within the current directory.
- Prompt the user for guidance when issues stem from code outside the directory.

## Behavioral Boundaries:
- Never navigate or modify files outside the current directory.
- Always verify the file ends with _test.v before execution.
- If the file is not found, warn the user and request an alternative file.
- Do not attempt fixes for external dependencies or code.

## Operational Workflow:
1. **File Search**: Look for the specified file in the current directory.
   - If the file is not found:
     - Warn the user: "File '(unknown)' not found in the current directory."
     - Ask: "Please provide another file name to test."

2. **Test Execution**: Run the test using `vtest`.
   ```bash
   vtest (unknown)
   ```

3. **Output Analysis**:
   - **Compile Issues**:
     - Identify the source of the error.
     - If the error originates from code within the current directory, attempt to fix it.
     - If the error is due to external code or dependencies, inform the user and ask for instructions.
   - **Assertion Failures**:
     - Locate the failing assertion.
     - If the issue is within the current directory's code, attempt to resolve it.
     - If the issue involves external code, inform the user and seek guidance.

4. **Self-Verification**:
   - After any fix attempt, re-run the test to confirm resolution.
   - Report the final outcome clearly to the user.

## Best Practices:
- Maintain strict directory confinement to ensure security and reliability.
- Prioritize user feedback when external dependencies are involved.
- Use precise error reporting to aid troubleshooting.
- Ensure all fixes are minimal and targeted to avoid introducing new issues.
71 .qwen/agents/testgenerator.md (new file)
@@ -0,0 +1,71 @@
---
name: testgenerator
description: Use this agent when you need to analyze a given source file, generate or update its corresponding test file, and ensure the test file executes correctly by leveraging the testexecutor subagent.
color: Automatic Color
---

You are an expert Vlang test generation agent with deep knowledge of Vlang testing conventions and the Herolib framework. Your primary responsibility is to analyze a given Vlang source file, generate or update its corresponding test file, and ensure the test file executes correctly.

## Core Responsibilities

1. **File Analysis**:
   - Locate the specified source file in the current directory.
   - If the file is not found, prompt the user with a clear error message.
   - Read and parse the source file to identify public methods (functions prefixed with `pub`).

2. **Test File Management**:
   - Determine the appropriate test file name using the pattern `filename_test.v`, where `filename` is the base name of the source file.
   - If the test file does not exist, generate a new one.
   - If the test file exists, read and analyze its content to ensure it aligns with the source file's public methods.
   - Do not look for test files outside of this directory.

3. **Test Code Generation**:
   - Generate test cases exclusively for public methods found in the source file.
   - Ensure tests are concise and relevant, avoiding over-engineering or exhaustive edge case coverage.
   - Write the test code to the corresponding test file.

4. **Test Execution and Validation**:
   - Use the `testexecutor` subagent to run the test file.
   - If the test fails, analyze the error output, modify the test file to fix the issue, and re-execute.
   - Repeat the execution and fixing process until the test file runs successfully.

## Behavioral Boundaries

- **Focus Scope**: Only test public methods. Do not test private functions or generate excessive test cases.
- **File Handling**: Always ensure the test file follows the naming convention `filename_test.v`.
- **Error Handling**: If the source file is not found, clearly inform the user. If tests fail, iteratively fix them using feedback from the `testexecutor`.
- **Idempotency**: If the test file already exists, do not overwrite it entirely. Only update or add missing test cases.
- **Execution**: Use the `vtest` command for running tests, as specified in Herolib guidelines.

## Workflow Steps

1. **Receive Input**: Accept the source file name as an argument.
2. **Locate File**: Check if the file exists in the current directory. If not, notify the user.
3. **Parse Source**: Read the file and extract all public methods.
4. **Check Test File**:
   - Derive the test file name: `filename_test.v`.
   - If it does not exist, create it with basic test scaffolding.
   - If it exists, read its content to understand current test coverage.
5. **Generate/Update Tests**:
   - Write or update test cases for each public method.
   - Ensure tests are minimal and focused.
6. **Execute Tests**:
   - Use the `testexecutor` agent to run the test file.
   - If execution fails, analyze the output, fix the test file, and re-execute.
   - Continue until tests pass or a critical error is encountered.
7. **Report Status**: Once tests pass, report success. If issues persist, provide a detailed error summary.

## Output Format

- Always provide a clear status update after each test execution.
- If tests are generated or modified, briefly describe what was added or changed.
- If errors occur, explain the issue and the steps taken to resolve it.
- If the source file is not found, provide a user-friendly error message.

## Example Usage

- **Context**: User wants to generate tests for `calculator.v`.
- **Action**: Check if `calculator.v` exists.
- **Action**: Create or update `calculator_test.v` with tests for public methods.
- **Action**: Use `testexecutor` to run `calculator_test.v`.
- **Action**: If tests fail, fix them iteratively until they pass.
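The example usage above could produce a test file along these lines. The `add` function is a hypothetical illustration, assuming `calculator.v` in the same module defines it; it is not part of herolib.

```v
// Hypothetical calculator_test.v the agent might generate.
// Assumes calculator.v in the same module defines: pub fn add(a int, b int) int
// Run with: vtest calculator_test.v
fn test_add() {
	assert add(2, 3) == 5
	assert add(-1, 1) == 0
}
```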
@@ -24,7 +24,7 @@ Thank you for your interest in contributing to Herolib! This document provides g
 For developers, you can use the automated installation script:

 ```bash
-curl 'https://raw.githubusercontent.com/incubaid/herolib/refs/heads/development/install_v.sh' > /tmp/install_v.sh
+curl 'https://raw.githubusercontent.com/freeflowuniverse/herolib/refs/heads/development/install_v.sh' > /tmp/install_v.sh
 bash /tmp/install_v.sh --analyzer --herolib
 # IMPORTANT: Start a new shell after installation for paths to be set correctly
 ```
@@ -32,9 +32,9 @@ bash /tmp/install_v.sh --analyzer --herolib
 Alternatively, you can manually set up the environment:

 ```bash
-mkdir -p ~/code/github/freeflowuniverse
-cd ~/code/github/freeflowuniverse
-git clone git@github.com:incubaid/herolib.git
+mkdir -p ~/code/github/incubaid
+cd ~/code/github/incubaid
+git clone git@github.com:freeflowuniverse/herolib.git
 cd herolib
 # checkout development branch for most recent changes
 git checkout development
@@ -179,5 +179,5 @@ For more details, see the [README.md](README.md) troubleshooting section.
 ## Additional Resources

 - [Herolib Documentation](https://freeflowuniverse.github.io/herolib/)
-- [Cookbook Examples](https://github.com/incubaid/herolib/tree/development/cookbook)
+- [Cookbook Examples](https://github.com/freeflowuniverse/herolib/tree/development/cookbook)
 - [AI Prompts](aiprompts/starter/0_start_here.md)
16 WARP.md (new file)
@@ -0,0 +1,16 @@
# WARP.md

This file provides guidance to WARP (warp.dev) when working with code in this repository.

## Commands to Use

### Testing
- **Run Tests**: Use `vtest ~/code/github/incubaid/herolib/lib/osal/package_test.v` to run a specific test.

## High-Level Architecture
- **Project Structure**: The project is organized into multiple modules located in the `lib` and `src` directories. Prioritized compilation and caching strategies are used across modules.
- **Script Handling**: Vlang scripts are crucial and should follow the instructions in `aiprompts/vlang_herolib_core.md`.

## Special Instructions
- **Documentation Reference**: Always refer to `aiprompts/vlang_herolib_core.md` for essential instructions on Vlang and Heroscript code generation and execution.
- **Environment Specifics**: Ensure Redis and other dependencies are configured as per the scripts provided in the codebase.
3583 aiprompts/todo/refactor_herofs.md (new file)
File diff suppressed because it is too large.

1418 aiprompts/vlang_herolib_core.md (new file)
File diff suppressed because it is too large.
@@ -30,8 +30,23 @@ if additional_args.len > 0 {
 exit(1)
 }

-// Change to the hero directory
-hero_dir := os.join_path(os.home_dir(), 'code/github/incubaid/herolib/cli')
+// Determine the hero directory dynamically
+// Get the directory where this script is located
+script_dir := os.dir(os.executable())
+// The script is in cli/, so the herolib root is one level up
+herolib_root := os.dir(script_dir)
+hero_dir := os.join_path(herolib_root, 'cli')
+
+// Verify the directory exists and contains hero.v
+if !os.exists(hero_dir) {
+	panic('Hero CLI directory not found: ${hero_dir}')
+}
+hero_v_path := os.join_path(hero_dir, 'hero.v')
+if !os.exists(hero_v_path) {
+	panic('hero.v not found in: ${hero_dir}')
+}
+
+println('Using hero directory: ${hero_dir}')
+os.chdir(hero_dir) or { panic('Failed to change directory to ${hero_dir}: ${err}') }

 // Set HEROPATH based on OS

@@ -218,13 +218,13 @@ function hero_lib_pull

 function hero_lib_get {

-    mkdir -p $DIR_CODE/github/freeflowuniverse
+    mkdir -p $DIR_CODE/github/incubaid
     if [[ -d "$DIR_CODE/github/incubaid/herolib" ]]
     then
         hero_lib_pull
     else
-        pushd $DIR_CODE/github/freeflowuniverse 2>&1 >> /dev/null
-        git clone --depth 1 --no-single-branch https://github.com/incubaid/herolib.git
+        pushd $DIR_CODE/github/incubaid 2>&1 >> /dev/null
+        git clone --depth 1 --no-single-branch https://github.com/freeflowuniverse/herolib.git
         popd 2>&1 >> /dev/null
     fi
 }

@@ -2,6 +2,6 @@

 import freeflowuniverse.herolib.mcp.aitools

-// aitools.convert_pug("/root/code/github/freeflowuniverse/herolauncher/pkg/herolauncher/web/templates/admin")!
+// aitools.convert_pug("/root/code/github/incubaid/herolauncher/pkg/herolauncher/web/templates/admin")!

-aitools.convert_pug('/root/code/github/freeflowuniverse/herolauncher/pkg/zaz/webui/templates')!
+aitools.convert_pug('/root/code/github/incubaid/herolauncher/pkg/zaz/webui/templates')!

@@ -14,5 +14,5 @@ println(model.sheet)
 // println(model.sheet.export()!)

 model.sheet.pprint()!
-// model.sheet.export(path: '~/code/github/freeflowuniverse/starlight_template/src/content/test.csv')!
+// model.sheet.export(path: '~/code/github/incubaid/starlight_template/src/content/test.csv')!
 // model.sheet

1 examples/crypt/.gitignore (vendored, new file)
@@ -0,0 +1 @@
crypt_example
103 examples/crypt/crypt_example.vsh (new executable file)
@@ -0,0 +1,103 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run

import freeflowuniverse.herolib.crypt.herocrypt
import time

// Initialize the HeroCrypt client
// Assumes herodb is running on localhost:6381
mut client := herocrypt.new_default()!

println('HeroCrypt client initialized')

// -- Stateless (Ephemeral) Workflow --
println('\n--- Stateless (Ephemeral) Workflow ---')

// 1. Generate ephemeral encryption keypair
println('Generating ephemeral encryption keypair...')
enc_keypair := client.gen_enc_keypair()!
recipient_pub := enc_keypair[0]
identity_sec := enc_keypair[1]
println(' Recipient Public Key: ${recipient_pub[..30]}...')
println(' Identity Secret Key: ${identity_sec[..30]}...')

// 2. Encrypt a message
message := 'Hello, Stateless World!'
println('\nEncrypting message: "${message}"')
ciphertext := client.encrypt(recipient_pub, message)!
println(' Ciphertext: ${ciphertext[..30]}...')

// 3. Decrypt the message
println('\nDecrypting ciphertext...')
decrypted_message := client.decrypt(identity_sec, ciphertext)!
println(' Decrypted Message: ${decrypted_message}')
assert decrypted_message == message

// 4. Generate ephemeral signing keypair
println('\nGenerating ephemeral signing keypair...')
sign_keypair := client.gen_sign_keypair()!
verify_pub_b64 := sign_keypair[0]
sign_sec_b64 := sign_keypair[1]
println(' Verify Public Key (b64): ${verify_pub_b64[..30]}...')
println(' Sign Secret Key (b64): ${sign_sec_b64[..30]}...')

// 5. Sign a message
sign_message := 'This message is signed.'
println('\nSigning message: "${sign_message}"')
signature := client.sign(sign_sec_b64, sign_message)!
println(' Signature: ${signature[..30]}...')

// 6. Verify the signature
println('\nVerifying signature...')
is_valid := client.verify(verify_pub_b64, sign_message, signature)!
println(' Signature is valid: ${is_valid}')
assert is_valid

// -- Key-Managed (Persistent, Named) Workflow --
println('\n--- Key-Managed (Persistent, Named) Workflow ---')

// 1. Generate and persist a named encryption keypair
enc_key_name := 'my_app_enc_key'
println('\nGenerating and persisting named encryption keypair: "${enc_key_name}"')
client.keygen(enc_key_name)!

// 2. Encrypt a message by name
persistent_message := 'Hello, Persistent World!'
println('Encrypting message by name: "${persistent_message}"')
persistent_ciphertext := client.encrypt_by_name(enc_key_name, persistent_message)!
println(' Ciphertext: ${persistent_ciphertext[..30]}...')

// 3. Decrypt the message by name
println('Decrypting ciphertext by name...')
decrypted_persistent_message := client.decrypt_by_name(enc_key_name, persistent_ciphertext)!
println(' Decrypted Message: ${decrypted_persistent_message}')
assert decrypted_persistent_message == persistent_message

// 4. Generate and persist a named signing keypair
sign_key_name := 'my_app_sign_key'
println('\nGenerating and persisting named signing keypair: "${sign_key_name}"')
client.sign_keygen(sign_key_name)!

// 5. Sign a message by name
persistent_sign_message := 'This persistent message is signed.'
println('Signing message by name: "${persistent_sign_message}"')
persistent_signature := client.sign_by_name(sign_key_name, persistent_sign_message)!
println(' Signature: ${persistent_signature[..30]}...')

// 6. Verify the signature by name
println('Verifying signature by name...')
is_persistent_valid := client.verify_by_name(sign_key_name, persistent_sign_message, persistent_signature)!
println(' Signature is valid: ${is_persistent_valid}')
assert is_persistent_valid

// // 7. List all stored keys
// println('\n--- Listing Stored Keys ---')
// keys := client.list_keys()!
// println('Stored keys: ${keys}')

// // -- Clean up created keys --
// println('\n--- Cleaning up ---')
// client.redis_client.del('age:enc:${enc_key_name}')!
// client.redis_client.del('age:sign:${sign_key_name}')!
// println('Cleaned up persistent keys.')

println('\nHeroCrypt example finished successfully!')
3 examples/hero/herofs/.gitignore (vendored, new file)
@@ -0,0 +1,3 @@
herofs_basic
herofs_server
fs_server
34 examples/hero/herofs/fs_server.vsh (new executable file)
@@ -0,0 +1,34 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run

import freeflowuniverse.herolib.hero.herofs_server
import freeflowuniverse.herolib.ui.console

fn main() {
	console.print_header('HeroFS REST API Server Example')

	// Create server with CORS enabled for development
	mut server := herofs_server.new(
		port: 8080
		host: 'localhost'
		cors_enabled: true
		allowed_origins: ['*'] // Allow all origins for development
	)!

	console.print_item('Server configured successfully')
	console.print_item('Starting server...')
	console.print_item('')
	console.print_item('Available endpoints:')
	console.print_item(' Health check: GET http://localhost:8080/health')
	console.print_item(' API info: GET http://localhost:8080/api')
	console.print_item(' Filesystems: http://localhost:8080/api/fs')
	console.print_item(' Directories: http://localhost:8080/api/dirs')
	console.print_item(' Files: http://localhost:8080/api/files')
	console.print_item(' Blobs: http://localhost:8080/api/blobs')
	console.print_item(' Symlinks: http://localhost:8080/api/symlinks')
	console.print_item(' Tools: http://localhost:8080/api/tools')
	console.print_item('')
	console.print_item('Press Ctrl+C to stop the server')

	// Start the server (this blocks)
	server.start()!
}
208 examples/hero/herofs/fs_tools_example.vsh (new executable file)
@@ -0,0 +1,208 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run

import freeflowuniverse.herolib.hero.herofs

// Example demonstrating the new FsTools high-level filesystem operations
// This shows how to use find, cp, rm, and mv operations

fn main() {
	// Initialize the HeroFS factory
	mut fs_factory := herofs.new()!
	println('HeroFS factory initialized')

	// Create a new filesystem
	mut my_fs := fs_factory.fs.new(
		name: 'tools_demo'
		description: 'Demonstration filesystem for fs_tools'
		quota_bytes: 1024 * 1024 * 1024 // 1GB quota
	)!

	// Save the filesystem
	my_fs = fs_factory.fs.set(my_fs)!
	println('Created filesystem: ${my_fs.name} with ID: ${my_fs.id}')

	// Create root directory
	mut root_dir := fs_factory.fs_dir.new(
		name: 'root'
		fs_id: my_fs.id
		parent_id: 0
		description: 'Root directory'
	)!
	root_dir = fs_factory.fs_dir.set(root_dir)!

	// Update the filesystem with the root directory ID
	my_fs.root_dir_id = root_dir.id
	my_fs = fs_factory.fs.set(my_fs)!

	// Create some sample directory structure
	println('\nCreating sample directory structure...')

	// Create directories using the high-level tools (which will use create_path)
	src_dir_id := fs_factory.fs_dir.create_path(my_fs.id, '/src')!
	_ := fs_factory.fs_dir.create_path(my_fs.id, '/docs')!
	test_dir_id := fs_factory.fs_dir.create_path(my_fs.id, '/tests')!
	examples_dir_id := fs_factory.fs_dir.create_path(my_fs.id, '/examples')!

	// Create some sample files
	println('Creating sample files...')

	// Create blobs for file content
	v_code := 'fn main() {\n println("Hello from V!")\n}\n'.bytes()
	mut v_blob := fs_factory.fs_blob.new(data: v_code)!
	v_blob = fs_factory.fs_blob.set(v_blob)!

	readme_content := '# My Project\n\nThis is a sample project.\n\n## Features\n\n- Feature 1\n- Feature 2\n'.bytes()
	mut readme_blob := fs_factory.fs_blob.new(data: readme_content)!
	readme_blob = fs_factory.fs_blob.set(readme_blob)!

	test_content := 'fn test_main() {\n assert 1 == 1\n}\n'.bytes()
	mut test_blob := fs_factory.fs_blob.new(data: test_content)!
	test_blob = fs_factory.fs_blob.set(test_blob)!

	// Create files
	mut main_file := fs_factory.fs_file.new(
		name: 'main.v'
		fs_id: my_fs.id
		blobs: [v_blob.id]
		mime_type: .txt
	)!
	main_file = fs_factory.fs_file.set(main_file)!
	fs_factory.fs_file.add_to_directory(main_file.id, src_dir_id)!

	mut readme_file := fs_factory.fs_file.new(
		name: 'README.md'
		fs_id: my_fs.id
		blobs: [readme_blob.id]
		mime_type: .md
	)!
	readme_file = fs_factory.fs_file.set(readme_file)!
	fs_factory.fs_file.add_to_directory(readme_file.id, root_dir.id)!

	mut test_file := fs_factory.fs_file.new(
		name: 'main_test.v'
		fs_id: my_fs.id
		blobs: [test_blob.id]
		mime_type: .txt
	)!
	test_file = fs_factory.fs_file.set(test_file)!
	fs_factory.fs_file.add_to_directory(test_file.id, test_dir_id)!

	// Create a symbolic link
	mut main_symlink := fs_factory.fs_symlink.new(
		name: 'main_link.v'
		fs_id: my_fs.id
		parent_id: examples_dir_id
		target_id: main_file.id
		target_type: .file
		description: 'Link to main.v'
	)!
	main_symlink = fs_factory.fs_symlink.set(main_symlink)!

	println('Sample filesystem structure created!')

	// Get the filesystem instance for tools operations
	mut fs := fs_factory.fs.get(my_fs.id)!

	// Demonstrate FIND functionality
	println('\n=== FIND OPERATIONS ===')

	// Find all files
	println('\nFinding all files...')
	all_results := fs.find('/', recursive: true)!
	for result in all_results {
		type_str := match result.result_type {
			.file { 'FILE' }
			.directory { 'DIR ' }
			.symlink { 'LINK' }
		}
		println('${type_str}: ${result.path} (ID: ${result.id})')
	}

	// Find only V files
	println('\nFinding only .v files...')
	v_files := fs.find('/', recursive: true, include_patterns: ['*.v'])!
	for result in v_files {
		println('V FILE: ${result.path}')
	}

	// Find with exclude patterns
	println('\nFinding all except test files...')
	non_test_results := fs.find('/',
		recursive: true
		exclude_patterns: [
			'*test*',
		]
	)!
	for result in non_test_results {
		type_str := match result.result_type {
			.file { 'FILE' }
			.directory { 'DIR ' }
			.symlink { 'LINK' }
		}
		println('${type_str}: ${result.path}')
	}

	// Demonstrate COPY functionality
	println('\n=== COPY OPERATIONS ===')

	// Copy a single file
	println('Copying /src/main.v to /docs/')
	fs.cp('/src/main.v', '/docs/', herofs.FindOptions{ recursive: false }, herofs.CopyOptions{
		overwrite: true
		copy_blobs: true
	})!

	// Copy all V files to examples directory
	println('Copying all .v files to /examples/')
	fs.cp('/', '/examples/', herofs.FindOptions{
		recursive: true
		include_patterns: [
			'*.v',
		]
	}, herofs.CopyOptions{
		overwrite: true
		copy_blobs: false
	})! // Reference same blobs

	// Demonstrate MOVE functionality
	println('\n=== MOVE OPERATIONS ===')

	// Move the copied file to a new location with rename
	println('Moving /docs/main.v to /examples/main_backup.v')
	fs.mv('/docs/main.v', '/examples/main_backup.v', herofs.MoveOptions{ overwrite: true })!

	// Move README to root
	println('Moving /README.md to /project_readme.md')
	fs.mv('/README.md', '/project_readme.md', herofs.MoveOptions{ overwrite: false })!

	// Demonstrate REMOVE functionality
	println('\n=== REMOVE OPERATIONS ===')

	// Remove a specific file
	println('Removing /tests/main_test.v')
	fs.rm('/tests/main_test.v', herofs.FindOptions{ recursive: false }, herofs.RemoveOptions{
		delete_blobs: false
	})!

	// Remove all files in docs directory (but keep the directory)
	println('Removing all files in /docs/ directory')
	fs.rm('/docs/', herofs.FindOptions{ recursive: false, include_patterns: ['*'] }, herofs.RemoveOptions{
		delete_blobs: false
	})!

	println('\nAll copy, move, and remove operations completed successfully!')

	// Show final filesystem state
	println('\n=== FINAL FILESYSTEM STATE ===')
	final_results := fs.find('/', recursive: true)!
	for result in final_results {
		type_str := match result.result_type {
			.file { 'FILE' }
			.directory { 'DIR ' }
			.symlink { 'LINK' }
		}
		println('${type_str}: ${result.path} (ID: ${result.id})')
	}

	println('\nfs_tools demonstration completed successfully!')
}
@@ -1,9 +1,6 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run

import freeflowuniverse.herolib.core.redisclient
import freeflowuniverse.herolib.hero.herofs
-import time
-import os

// Advanced example of using HeroFS - the Hero Filesystem
// Demonstrates more complex operations including:
@@ -25,25 +22,25 @@ fn main() {
		quota_bytes: 5 * 1024 * 1024 * 1024 // 5GB quota
	)!

-	// Save the filesystem to get an ID
-	fs_id := fs_factory.fs.set(my_fs)!
-	println('Created filesystem: ${my_fs.name} with ID: ${fs_id}')
+	// Save the filesystem
+	my_fs = fs_factory.fs.set(my_fs)!
+	println('Created filesystem: ${my_fs.name} with ID: ${my_fs.id}')

	// Create root directory
	mut root_dir := fs_factory.fs_dir.new(
		name: 'root'
-		fs_id: fs_id
+		fs_id: my_fs.id
		parent_id: 0 // Root has no parent
		description: 'Root directory'
	)!

	// Save the root directory
-	root_dir_id := fs_factory.fs_dir.set(root_dir)!
-	println('Created root directory with ID: ${root_dir_id}')
+	root_dir = fs_factory.fs_dir.set(root_dir)!
+	println('Created root directory with ID: ${root_dir.id}')

	// Update the filesystem with the root directory ID
-	my_fs.root_dir_id = root_dir_id
-	fs_factory.fs.set(my_fs)!
+	my_fs.root_dir_id = root_dir.id
+	my_fs = fs_factory.fs.set(my_fs)!

	// Create a directory hierarchy
	println('\nCreating directory hierarchy...')
@@ -51,44 +48,56 @@ fn main() {
	// Main project directories
	mut src_dir := fs_factory.fs_dir.new(
		name: 'src'
-		fs_id: fs_id
-		parent_id: root_dir_id
+		fs_id: my_fs.id
+		parent_id: root_dir.id
		description: 'Source code'
	)!
-	src_dir_id := fs_factory.fs_dir.set(src_dir)!
+	src_dir = fs_factory.fs_dir.set(src_dir)!

	mut docs_dir := fs_factory.fs_dir.new(
		name: 'docs'
-		fs_id: fs_id
-		parent_id: root_dir_id
+		fs_id: my_fs.id
+		parent_id: root_dir.id
		description: 'Documentation'
	)!
-	docs_dir_id := fs_factory.fs_dir.set(docs_dir)!
+	docs_dir = fs_factory.fs_dir.set(docs_dir)!

	mut assets_dir := fs_factory.fs_dir.new(
		name: 'assets'
-		fs_id: fs_id
-		parent_id: root_dir_id
+		fs_id: my_fs.id
+		parent_id: root_dir.id
		description: 'Project assets'
	)!
-	assets_dir_id := fs_factory.fs_dir.set(assets_dir)!
+	assets_dir = fs_factory.fs_dir.set(assets_dir)!

	// Subdirectories
	mut images_dir := fs_factory.fs_dir.new(
		name: 'images'
-		fs_id: fs_id
-		parent_id: assets_dir_id
+		fs_id: my_fs.id
+		parent_id: assets_dir.id
		description: 'Image assets'
	)!
-	images_dir_id := fs_factory.fs_dir.set(images_dir)!
+	images_dir = fs_factory.fs_dir.set(images_dir)!

	mut api_docs_dir := fs_factory.fs_dir.new(
		name: 'api'
-		fs_id: fs_id
-		parent_id: docs_dir_id
+		fs_id: my_fs.id
+		parent_id: docs_dir.id
		description: 'API documentation'
	)!
-	api_docs_dir_id := fs_factory.fs_dir.set(api_docs_dir)!
+	api_docs_dir = fs_factory.fs_dir.set(api_docs_dir)!

+	// Add directories to their parents
+	root_dir.directories << src_dir.id
+	root_dir.directories << docs_dir.id
+	root_dir.directories << assets_dir.id
+	root_dir = fs_factory.fs_dir.set(root_dir)!
+
+	assets_dir.directories << images_dir.id
+	assets_dir = fs_factory.fs_dir.set(assets_dir)!
+
+	docs_dir.directories << api_docs_dir.id
+	docs_dir = fs_factory.fs_dir.set(docs_dir)!
+
	println('Directory hierarchy created successfully')

@@ -97,67 +106,55 @@ fn main() {

	// Text file for source code
	code_content := 'fn main() {\n println("Hello, HeroFS!")\n}\n'.bytes()
-	mut code_blob := fs_factory.fs_blob.new(
-		data: code_content
-		mime_type: 'text/plain'
-		name: 'main.v blob'
-	)!
-	code_blob_id := fs_factory.fs_blob.set(code_blob)!
+	mut code_blob := fs_factory.fs_blob.new(data: code_content)!
+	code_blob = fs_factory.fs_blob.set(code_blob)!

	mut code_file := fs_factory.fs_file.new(
-		name: 'main.v'
-		fs_id: fs_id
-		directories: [src_dir_id]
-		blobs: [code_blob_id]
-		mime_type: 'text/plain'
-		metadata: {
+		name: 'main.v'
+		fs_id: my_fs.id
+		blobs: [code_blob.id]
+		mime_type: .txt
+		metadata: {
			'language': 'vlang'
			'version': '0.3.3'
		}
	)!
-	code_file_id := fs_factory.fs_file.set(code_file)!
+	code_file = fs_factory.fs_file.set(code_file)!
+	fs_factory.fs_file.add_to_directory(code_file.id, src_dir.id)!

	// Markdown documentation file
	docs_content := '# API Documentation\n\n## Endpoints\n\n- GET /api/v1/users\n- POST /api/v1/users\n'.bytes()
-	mut docs_blob := fs_factory.fs_blob.new(
-		data: docs_content
-		mime_type: 'text/markdown'
-		name: 'api.md blob'
-	)!
-	docs_blob_id := fs_factory.fs_blob.set(docs_blob)!
+	mut docs_blob := fs_factory.fs_blob.new(data: docs_content)!
+	docs_blob = fs_factory.fs_blob.set(docs_blob)!

	mut docs_file := fs_factory.fs_file.new(
-		name: 'api.md'
-		fs_id: fs_id
-		directories: [api_docs_dir_id]
-		blobs: [docs_blob_id]
-		mime_type: 'text/markdown'
+		name: 'api.md'
+		fs_id: my_fs.id
+		blobs: [docs_blob.id]
+		mime_type: .md
	)!
-	docs_file_id := fs_factory.fs_file.set(docs_file)!
+	docs_file = fs_factory.fs_file.set(docs_file)!
+	fs_factory.fs_file.add_to_directory(docs_file.id, api_docs_dir.id)!

	// Create a binary file (sample image)
	// For this example, we'll just create random bytes
	mut image_data := []u8{len: 1024, init: u8(index % 256)}
-	mut image_blob := fs_factory.fs_blob.new(
-		data: image_data
-		mime_type: 'image/png'
-		name: 'logo.png blob'
-	)!
-	image_blob_id := fs_factory.fs_blob.set(image_blob)!
+	mut image_blob := fs_factory.fs_blob.new(data: image_data)!
+	image_blob = fs_factory.fs_blob.set(image_blob)!

	mut image_file := fs_factory.fs_file.new(
-		name: 'logo.png'
-		fs_id: fs_id
-		directories: [images_dir_id]
-		blobs: [image_blob_id]
-		mime_type: 'image/png'
-		metadata: {
+		name: 'logo.png'
+		fs_id: my_fs.id
+		blobs: [image_blob.id]
+		mime_type: .png
+		metadata: {
			'width': '200'
			'height': '100'
			'format': 'PNG'
		}
	)!
-	image_file_id := fs_factory.fs_file.set(image_file)!
+	image_file = fs_factory.fs_file.set(image_file)!
+	fs_factory.fs_file.add_to_directory(image_file.id, images_dir.id)!

	println('Files created successfully')

@@ -167,110 +164,151 @@ fn main() {
	// Symlink to the API docs from the root directory
	mut api_symlink := fs_factory.fs_symlink.new(
		name: 'api-docs'
-		fs_id: fs_id
-		parent_id: root_dir_id
-		target_id: api_docs_dir_id
+		fs_id: my_fs.id
+		parent_id: root_dir.id
+		target_id: api_docs_dir.id
		target_type: .directory
		description: 'Shortcut to API documentation'
	)!
-	api_symlink_id := fs_factory.fs_symlink.set(api_symlink)!
+	api_symlink = fs_factory.fs_symlink.set(api_symlink)!

	// Symlink to the logo from the docs directory
	mut logo_symlink := fs_factory.fs_symlink.new(
		name: 'logo.png'
-		fs_id: fs_id
-		parent_id: docs_dir_id
-		target_id: image_file_id
+		fs_id: my_fs.id
+		parent_id: docs_dir.id
+		target_id: image_file.id
		target_type: .file
		description: 'Shortcut to project logo'
	)!
-	logo_symlink_id := fs_factory.fs_symlink.set(logo_symlink)!
+	logo_symlink = fs_factory.fs_symlink.set(logo_symlink)!

+	// Add symlinks to their parent directories
+	root_dir.symlinks << api_symlink.id
+	root_dir = fs_factory.fs_dir.set(root_dir)!
+
+	docs_dir.symlinks << logo_symlink.id
+	docs_dir = fs_factory.fs_dir.set(docs_dir)!
+
	println('Symlinks created successfully')

-	// Demonstrate file operations
-	println('\nDemonstrating file operations...')
+	// Demonstrate filesystem navigation using find
+	println('\nDemonstrating filesystem navigation...')

-	// 1. Move a file to multiple directories (hard link-like behavior)
-	println('Moving logo.png to both images and docs directories...')
-	image_file = fs_factory.fs_file.get(image_file_id)!
-	fs_factory.fs_file.move(image_file_id, [images_dir_id, docs_dir_id])!
-	image_file = fs_factory.fs_file.get(image_file_id)!
+	// Get the filesystem instance for navigation
+	mut fs := fs_factory.fs.get(my_fs.id)!

-	// 2. Rename a file
-	println('Renaming main.v to app.v...')
-	fs_factory.fs_file.rename(code_file_id, 'app.v')!
-	code_file = fs_factory.fs_file.get(code_file_id)!
+	// Find all items in the filesystem
+	results := fs.find('/', recursive: true)!
+	println('Complete filesystem structure:')
+	for result in results {
+		type_str := match result.result_type {
+			.file { 'FILE' }
+			.directory { 'DIR ' }
+			.symlink { 'LINK' }
+		}
+		println('${type_str}: ${result.path} (ID: ${result.id})')
+	}

-	// 3. Update file metadata
+	// Find specific file types
+	println('\nFinding specific file types...')
+	v_files := fs.find('/', include_patterns: ['*.v'], recursive: true)!
+	println('V source files:')
+	for file in v_files {
+		println(' ${file.path}')
+	}
+
+	md_files := fs.find('/', include_patterns: ['*.md'], recursive: true)!
+	println('Markdown files:')
+	for file in md_files {
+		println(' ${file.path}')
+	}
+
+	// Find files in specific directories
+	println('\nFinding files in specific directories...')
+	src_files := fs.find('/src', recursive: true)!
+	println('Files in src directory:')
+	for file in src_files {
+		println(' ${file.path}')
+	}
+
+	// Demonstrate advanced file operations
+	println('\nDemonstrating advanced file operations...')
+
+	// Update file metadata
	println('Updating file metadata...')
-	fs_factory.fs_file.update_metadata(docs_file_id, 'status', 'draft')!
-	fs_factory.fs_file.update_metadata(docs_file_id, 'author', 'HeroFS Team')!
+	fs_factory.fs_file.update_metadata(docs_file.id, 'status', 'draft')!
+	fs_factory.fs_file.update_metadata(docs_file.id, 'author', 'HeroFS Team')!

-	// 4. Update file access time when "reading" it
+	// Update access time
	println('Updating file access time...')
-	fs_factory.fs_file.update_accessed(docs_file_id)!
+	fs_factory.fs_file.update_accessed(docs_file.id)!

-	// 5. Add additional content to a file (append a blob)
+	// Rename a file
+	println('Renaming main.v to app.v...')
+	fs_factory.fs_file.rename(code_file.id, 'app.v')!
+
+	// Append content to a file
	println('Appending content to API docs...')
	additional_content := '\n## Authentication\n\nUse Bearer token for authentication.\n'.bytes()
-	mut additional_blob := fs_factory.fs_blob.new(
-		data: additional_content
-		mime_type: 'text/markdown'
-		name: 'api_append.md blob'
-	)!
-	additional_blob_id := fs_factory.fs_blob.set(additional_blob)!
-	fs_factory.fs_file.append_blob(docs_file_id, additional_blob_id)!
+	mut additional_blob := fs_factory.fs_blob.new(data: additional_content)!
+	additional_blob = fs_factory.fs_blob.set(additional_blob)!
+	fs_factory.fs_file.append_blob(docs_file.id, additional_blob.id)!

	// Demonstrate directory operations
	println('\nDemonstrating directory operations...')

-	// 1. Create a new directory and move it
+	// Create a temporary directory
	mut temp_dir := fs_factory.fs_dir.new(
		name: 'temp'
-		fs_id: fs_id
-		parent_id: root_dir_id
+		fs_id: my_fs.id
+		parent_id: root_dir.id
		description: 'Temporary directory'
	)!
-	temp_dir_id := fs_factory.fs_dir.set(temp_dir)!
+	temp_dir = fs_factory.fs_dir.set(temp_dir)!

-	println('Moving temp directory to be under docs...')
-	fs_factory.fs_dir.move(temp_dir_id, docs_dir_id)!
+	// Add to parent
+	root_dir.directories << temp_dir.id
+	root_dir = fs_factory.fs_dir.set(root_dir)!

-	// 2. Rename a directory
+	// Move temp directory under docs
+	println('Moving temp directory under docs...')
+	fs_factory.fs_dir.move(temp_dir.id, docs_dir.id)!
+
+	// Rename temp directory to drafts
	println('Renaming temp directory to drafts...')
-	fs_factory.fs_dir.rename(temp_dir_id, 'drafts')!
+	fs_factory.fs_dir.rename(temp_dir.id, 'drafts')!

-	// 3. Check if a directory has children
-	has_children := fs_factory.fs_dir.has_children(docs_dir_id)!
+	// Check if docs directory has children
+	has_children := fs_factory.fs_dir.has_children(docs_dir.id)!
	println('Does docs directory have children? ${has_children}')

-	// Demonstrate searching and filtering
-	println('\nDemonstrating searching and filtering...')
+	// Demonstrate listing operations
+	println('\nDemonstrating listing operations...')

-	// 1. List all files in the filesystem
-	all_files := fs_factory.fs_file.list_by_filesystem(fs_id)!
+	// List all files in filesystem
+	all_files := fs_factory.fs_file.list_by_filesystem(my_fs.id)!
	println('All files in filesystem (${all_files.len}):')
	for file in all_files {
		println('- ${file.name} (ID: ${file.id})')
	}

-	// 2. List files by MIME type
-	markdown_files := fs_factory.fs_file.list_by_mime_type('text/markdown')!
-	println('\nMarkdown files (${markdown_files.len}):')
-	for file in markdown_files {
+	// List files by MIME type
+	md_files_by_type := fs_factory.fs_file.list_by_mime_type(.md)!
+	println('\nMarkdown files (${md_files_by_type.len}):')
+	for file in md_files_by_type {
		println('- ${file.name} (ID: ${file.id})')
	}

-	// 3. List all symlinks
-	all_symlinks := fs_factory.fs_symlink.list_by_filesystem(fs_id)!
+	// List all symlinks
+	all_symlinks := fs_factory.fs_symlink.list_by_filesystem(my_fs.id)!
	println('\nAll symlinks (${all_symlinks.len}):')
	for symlink in all_symlinks {
		target_type_str := if symlink.target_type == .file { 'file' } else { 'directory' }
		println('- ${symlink.name} -> ${symlink.target_id} (${target_type_str})')
	}

-	// 4. Check for broken symlinks
+	// Check for broken symlinks
	println('\nChecking for broken symlinks:')
	for symlink in all_symlinks {
		is_broken := fs_factory.fs_symlink.is_broken(symlink.id)!
@@ -281,11 +319,11 @@ fn main() {
	println('\nDemonstrating file content retrieval:')

	// Get the updated API docs file and print its content
-	docs_file = fs_factory.fs_file.get(docs_file_id)!
-	println('Content of ${docs_file.name}:')
+	updated_docs_file := fs_factory.fs_file.get(docs_file.id)!
+	println('Content of ${updated_docs_file.name}:')
	mut full_content := ''

-	for blob_id in docs_file.blobs {
+	for blob_id in updated_docs_file.blobs {
		blob := fs_factory.fs_blob.get(blob_id)!
		full_content += blob.data.bytestr()
	}
@@ -294,12 +332,23 @@ fn main() {
	println(full_content)
	println('---END CONTENT---')

-	// Print filesystem usage
-	println('\nFilesystem usage:')
-	my_fs = fs_factory.fs.get(fs_id)!
-	println('Used: ${my_fs.used_bytes} bytes')
-	println('Quota: ${my_fs.quota_bytes} bytes')
-	println('Available: ${my_fs.quota_bytes - my_fs.used_bytes} bytes')
+	// Print filesystem information
+	println('\nFilesystem information:')
+	println('Filesystem: ${my_fs.name}')
+	println('Description: ${my_fs.description}')
+	println('Root directory ID: ${my_fs.root_dir_id}')

-	println('\nHeroFS advanced example completed successfully!')
+	println('\n=== HeroFS Advanced Example Completed Successfully! ===')
+	println('This example demonstrated:')
+	println('- Creating a complex directory hierarchy')
+	println('- Creating files with different content types (text, markdown, binary)')
+	println('- Creating symbolic links')
+	println('- Using the find functionality to navigate the filesystem')
+	println('- Advanced file operations: rename, metadata updates, append content')
+	println('- Advanced directory operations: move, rename, check children')
+	println('- Listing operations: files by filesystem, files by MIME type, symlinks')
+	println('- Symlink validation: checking for broken links')
+	println('- Retrieving and displaying file content')
+
+	println('\nAll advanced HeroFS operations are now fully implemented!')
}

@@ -1,109 +1,147 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run

import freeflowuniverse.herolib.core.redisclient
import freeflowuniverse.herolib.hero.herofs

// Basic example of using HeroFS - the Hero Filesystem
// Demonstrates creating a filesystem, directories, and files

-fn main() {
-	// Initialize the HeroFS factory
-	mut fs_factory := herofs.new()!
-	println('HeroFS factory initialized')
+fn test_cleanup() ! {
+	herofs.delete_fs_test()!
+}

-	// Create a new filesystem
-	mut my_fs := fs_factory.fs.new(
-		name: 'my_documents'
-		description: 'Personal documents filesystem'
+fn test_basic() ! {
+	defer {
+		test_cleanup() or { panic('cleanup failed: ${err.msg()}') }
+	}
+
+	test_cleanup()!
+
+	// Initialize the HeroFS factory for test purposes
+	mut fs_factory := herofs.new()!
+
+	// Create a new filesystem (required for FsBlobMembership validation)
+	mut test_fs := fs_factory.fs.new_get_set(
+		name: 'test_filesystem'
+		description: 'Filesystem for testing FsBlobMembership functionality'
		quota_bytes: 1024 * 1024 * 1024 // 1GB quota
	)!
+	println('Created test filesystem with ID: ${test_fs.id}')

-	// Save the filesystem to get an ID
-	fs_id := fs_factory.fs.set(my_fs)!
-	println('Created filesystem: ${my_fs.name} with ID: ${fs_id}')
-
-	// Create root directory
-	mut root_dir := fs_factory.fs_dir.new(
-		name: 'root'
-		fs_id: fs_id
-		parent_id: 0 // Root has no parent
-		description: 'Root directory'
-	)!
+	assert test_fs.id > 0
+	assert test_fs.root_dir_id > 0

-	// Save the root directory
-	root_dir_id := fs_factory.fs_dir.set(root_dir)!
-	println('Created root directory with ID: ${root_dir_id}')
+	mut root_dir := test_fs.root_dir()!

-	// Update the filesystem with the root directory ID
-	my_fs.root_dir_id = root_dir_id
-	fs_factory.fs.set(my_fs)!
+	// this means root_dir is automatically there, no need to create

-	// Create some subdirectories
-	mut docs_dir := fs_factory.fs_dir.new(
-		name: 'documents'
-		fs_id: fs_id
-		parent_id: root_dir_id
-		description: 'Documents directory'
-	)!
-
-	mut pics_dir := fs_factory.fs_dir.new(
-		name: 'pictures'
-		fs_id: fs_id
-		parent_id: root_dir_id
-		description: 'Pictures directory'
-	)!
-
-	// Save the subdirectories
-	docs_dir_id := fs_factory.fs_dir.set(docs_dir)!
-	pics_dir_id := fs_factory.fs_dir.set(pics_dir)!
-	println('Created documents directory with ID: ${docs_dir_id}')
-	println('Created pictures directory with ID: ${pics_dir_id}')
-
-	// Create a text file blob
-	text_content := 'Hello, world! This is a test file in HeroFS.'.bytes()
-	mut text_blob := fs_factory.fs_blob.new(
-		data: text_content
-		mime_type: 'text/plain'
-		name: 'hello.txt blob'
-	)!
-
-	// Save the blob
-	blob_id := fs_factory.fs_blob.set(text_blob)!
-	println('Created text blob with ID: ${blob_id}')
-
-	// Create a file referencing the blob
-	mut text_file := fs_factory.fs_file.new(
-		name: 'hello.txt'
-		fs_id: fs_id
-		directories: [docs_dir_id]
-		blobs: [blob_id]
-		mime_type: 'text/plain'
-	)!
-
-	// Save the file
-	file_id := fs_factory.fs_file.set(text_file)!
-	println('Created text file with ID: ${file_id}')
-
-	// List all directories in the filesystem
-	dirs := fs_factory.fs_dir.list_by_filesystem(fs_id)!
-	println('\nAll directories in filesystem:')
-	for dir in dirs {
-		println('- ${dir.name} (ID: ${dir.id})')
-	}
-
-	// List all files in the documents directory
-	files := fs_factory.fs_file.list_by_directory(docs_dir_id)!
-	println('\nFiles in documents directory:')
-	for file in files {
-		println('- ${file.name} (ID: ${file.id}, Size: ${file.size_bytes} bytes)')
-
-		// Get the file's content from its blobs
-		if file.blobs.len > 0 {
-			blob := fs_factory.fs_blob.get(file.blobs[0])!
-			content := blob.data.bytestr()
-			println(' Content: "${content}"')
-		}
-	}
-
-	println('\nHeroFS basic example completed successfully!')
+	println(root_dir)
}

+test_basic()!
+
+// // Initialize the HeroFS factory
+// mut fs_factory := herofs.new()!
+// println('HeroFS factory initialized')
+
+// // Create a new filesystem
+// mut my_fs := fs_factory.fs.new(
+// 	name: 'my_documents'
+// 	description: 'Personal documents filesystem'
+// 	quota_bytes: 1024 * 1024 * 1024 // 1GB quota
+// )!
+
+// // Save the filesystem
+// fs_factory.fs.set(mut my_fs)!
+// println('Created filesystem: ${my_fs.name} with ID: ${my_fs.id}')
+
+// // Create root directory
+// mut root_dir := fs_factory.fs_dir.new(
+// 	name: 'root'
+// 	fs_id: my_fs.id
+// 	parent_id: 0 // Root has no parent
+// 	description: 'Root directory'
+// )!
+
+// // Save the root directory
+// fs_factory.fs_dir.set(mut root_dir)!
+// println('Created root directory with ID: ${root_dir.id}')
+
+// // Update the filesystem with the root directory ID
+// my_fs.root_dir_id = root_dir.id
+// fs_factory.fs.set(mut my_fs)!
+
+// // Create some subdirectories
+// mut docs_dir := fs_factory.fs_dir.new(
+// 	name: 'documents'
+// 	fs_id: my_fs.id
+// 	parent_id: root_dir.id
+// 	description: 'Documents directory'
+// )!
+
+// mut pics_dir := fs_factory.fs_dir.new(
+// 	name: 'pictures'
+// 	fs_id: my_fs.id
+// 	parent_id: root_dir.id
+// 	description: 'Pictures directory'
+// )!
+
+// // Save the subdirectories
+// fs_factory.fs_dir.set(mut docs_dir)!
+// fs_factory.fs_dir.set(mut pics_dir)!
+
+// // Add subdirectories to root directory
+// root_dir.directories << docs_dir.id
+// root_dir.directories << pics_dir.id
+// fs_factory.fs_dir.set(mut root_dir)!
+
+// println('Created documents directory with ID: ${docs_dir.id}')
+// println('Created pictures directory with ID: ${pics_dir.id}')
+
+// // Create a text file blob
+// text_content := 'Hello, world! This is a test file in HeroFS.'.bytes()
+// mut text_blob := fs_factory.fs_blob.new(data: text_content)!
+
+// // Save the blob
+// fs_factory.fs_blob.set(mut text_blob)!
+// println('Created text blob with ID: ${text_blob.id}')
+
+// // Create a file referencing the blob
+// mut text_file := fs_factory.fs_file.new(
+// 	name: 'hello.txt'
+// 	fs_id: my_fs.id
+// 	blobs: [text_blob.id]
+// 	mime_type: .txt
+// )!
+
+// // Save the file
+// fs_factory.fs_file.set(mut text_file)!
+// // Associate file with documents directory
+// fs_factory.fs_file.add_to_directory(text_file.id, docs_dir.id)!
+// println('Created text file with ID: ${text_file.id}')
+
+// // Demonstrate filesystem navigation using find
+// mut fs := fs_factory.fs.get(my_fs.id)!
+
+// println('\nAll items in filesystem:')
+// results := fs.find('/', recursive: true)!
+// for result in results {
+// 	type_str := match result.result_type {
+// 		.file { 'FILE' }
+// 		.directory { 'DIR ' }
+// 		.symlink { 'LINK' }
+// 	}
+// 	println('- ${type_str}: ${result.path} (ID: ${result.id})')
+
+// 	// If it's a file, show its content
+// 	if result.result_type == .file {
+// 		file := fs_factory.fs_file.get(result.id)!
+// 		if file.blobs.len > 0 {
+// 			blob := fs_factory.fs_blob.get(file.blobs[0])!
+// 			content := blob.data.bytestr()
+// 			println(' Content: "${content}"')
+// 		}
+// 	}
+// }
+
+// println('\nHeroFS basic example completed successfully!')

@@ -1,26 +0,0 @@
#!/usr/bin/env vshell

// HeroFS RPC Example
// This example demonstrates how to start the HeroFS RPC server

import freeflowuniverse.herolib.hero.herofs.rpc { ServerArgs, start }

fn main() {
	// Example 1: Start RPC server with Unix socket
	println('Starting HeroFS RPC server with Unix socket...')
	mut args := ServerArgs{
		socket_path: '/tmp/herofs'
		http_port: 0 // No HTTP server
	}
	start(args)!
	println('HeroFS RPC server started successfully on Unix socket: ${args.socket_path}')

	// Example 2: Start RPC server with HTTP
	println('\nStarting HeroFS RPC server with HTTP on port 8080...')
	args = ServerArgs{
		socket_path: '/tmp/herofs'
		http_port: 8080
	}
	start(args)!
	println('HeroFS RPC server started successfully on HTTP port: ${args.http_port}')
}
210
examples/hero/herofs/import_export_example.vsh
Executable file
@@ -0,0 +1,210 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run

import freeflowuniverse.herolib.hero.herofs
import os

// Example demonstrating HeroFS import/export functionality
// This shows how to import files from real filesystem to VFS and export them back

fn main() {
	// Initialize the HeroFS factory
	mut fs_factory := herofs.new()!
	println('HeroFS factory initialized')

	// Create a new filesystem
	mut my_fs := fs_factory.fs.new(
		name: 'import_export_demo'
		description: 'Demonstration filesystem for import/export'
		quota_bytes: 1024 * 1024 * 1024 // 1GB quota
	)!

	// Save the filesystem
	my_fs = fs_factory.fs.set(my_fs)!
	println('Created filesystem: ${my_fs.name} with ID: ${my_fs.id}')

	// Create root directory
	mut root_dir := fs_factory.fs_dir.new(
		name: 'root'
		fs_id: my_fs.id
		parent_id: 0 // Root has no parent
	)!
	root_dir = fs_factory.fs_dir.set(root_dir)!
	my_fs.root_dir_id = root_dir.id
	my_fs = fs_factory.fs.set(my_fs)!

	// Get filesystem instance for operations
	mut fs := fs_factory.fs.get(my_fs.id)!
	fs.factory = &fs_factory

	// Create temporary test directory and files on real filesystem
	test_dir := '/tmp/herofs_import_test_${my_fs.id}'
	os.mkdir_all(test_dir)!
	defer {
		os.rmdir_all(test_dir) or {}
	}

	// Create test files
	test_file1 := os.join_path(test_dir, 'hello.txt')
	test_file2 := os.join_path(test_dir, 'example.v')
	test_file3 := os.join_path(test_dir, 'README.md')

	// Create subdirectory with files
	sub_dir := os.join_path(test_dir, 'docs')
	os.mkdir_all(sub_dir)!
	test_file4 := os.join_path(sub_dir, 'guide.md')

	// Write test content
	os.write_file(test_file1, 'Hello, HeroFS Import/Export!')!
	os.write_file(test_file2, 'fn main() {\n println("Imported V code!")\n}')!
	os.write_file(test_file3, '# HeroFS Demo\n\nThis file was imported from real filesystem.')!
	os.write_file(test_file4, '# User Guide\n\nThis is a guide in a subdirectory.')!

	println('\n=== IMPORT OPERATIONS ===')

	// Import single file
	println('Importing single file: ${test_file1}')
	fs.import(test_file1, '/imported_hello.txt', herofs.ImportOptions{
		overwrite: true
		preserve_meta: true
	})!

	// Import entire directory recursively
	println('Importing directory: ${test_dir}')
	fs.import(test_dir, '/imported_files', herofs.ImportOptions{
		recursive: true
		overwrite: true
		preserve_meta: true
	})!

	// Verify imports
	println('\nVerifying imported files...')
	imported_results := fs.find('/', recursive: true)!
	for result in imported_results {
		type_str := match result.result_type {
			.file { 'FILE' }
			.directory { 'DIR ' }
			.symlink { 'LINK' }
		}
		println('${type_str}: ${result.path}')
	}

	// Find specific file types
	v_files := fs.find('/', recursive: true, include_patterns: ['*.v'])!
	println('\nFound ${v_files.len} V files:')
	for file in v_files {
		println(' - ${file.path}')
	}

	md_files := fs.find('/', recursive: true, include_patterns: ['*.md'])!
	println('\nFound ${md_files.len} Markdown files:')
	for file in md_files {
		println(' - ${file.path}')
	}

	println('\n=== EXPORT OPERATIONS ===')

	// Create export directory
	export_dir := '/tmp/herofs_export_test_${my_fs.id}'
	os.mkdir_all(export_dir)!
	defer {
		os.rmdir_all(export_dir) or {}
	}

	// Export single file
	println('Exporting single file to: ${export_dir}/exported_hello.txt')
	fs.export('/imported_hello.txt', os.join_path(export_dir, 'exported_hello.txt'), herofs.ExportOptions{
		overwrite: true
		preserve_meta: true
	})!

	// Export entire directory
	println('Exporting directory to: ${export_dir}/exported_files')
	fs.export('/imported_files', os.join_path(export_dir, 'exported_files'), herofs.ExportOptions{
		recursive: true
		overwrite: true
		preserve_meta: true
	})!

	// Verify exports
	println('\nVerifying exported files...')
	if os.exists(os.join_path(export_dir, 'exported_hello.txt')) {
		content := os.read_file(os.join_path(export_dir, 'exported_hello.txt'))!
		println('✓ exported_hello.txt: "${content}"')
	}

	if os.exists(os.join_path(export_dir, 'exported_files', 'hello.txt')) {
		content := os.read_file(os.join_path(export_dir, 'exported_files', 'hello.txt'))!
|
||||
println('✓ exported_files/hello.txt: "${content}"')
|
||||
}
|
||||
|
||||
if os.exists(os.join_path(export_dir, 'exported_files', 'example.v')) {
|
||||
content := os.read_file(os.join_path(export_dir, 'exported_files', 'example.v'))!
|
||||
println('✓ exported_files/example.v contains: ${content.split('\n')[0]}')
|
||||
}
|
||||
|
||||
if os.exists(os.join_path(export_dir, 'exported_files', 'docs', 'guide.md')) {
|
||||
content := os.read_file(os.join_path(export_dir, 'exported_files', 'docs', 'guide.md'))!
|
||||
println('✓ exported_files/docs/guide.md: "${content.split('\n')[0]}"')
|
||||
}
|
||||
|
||||
println('\n=== MIME TYPE DETECTION ===')
|
||||
|
||||
// Test MIME type detection
|
||||
test_extensions := ['.txt', '.v', '.md', '.html', '.json', '.png', '.unknown']
|
||||
for ext in test_extensions {
|
||||
mime_type := herofs.extension_to_mime_type(ext)
|
||||
println('Extension ${ext} -> MIME type: ${mime_type}')
|
||||
}
|
||||
|
||||
println('\n=== OVERWRITE BEHAVIOR TEST ===')
|
||||
|
||||
// Test overwrite behavior
|
||||
test_overwrite_file := os.join_path(test_dir, 'overwrite_test.txt')
|
||||
os.write_file(test_overwrite_file, 'Original content')!
|
||||
|
||||
// Import without overwrite
|
||||
fs.import(test_overwrite_file, '/overwrite_test.txt', herofs.ImportOptions{
|
||||
overwrite: false
|
||||
})!
|
||||
|
||||
// Try to import again without overwrite (should fail silently or with error)
|
||||
println('Testing import without overwrite (should fail)...')
|
||||
fs.import(test_overwrite_file, '/overwrite_test.txt', herofs.ImportOptions{
|
||||
overwrite: false
|
||||
}) or { println('✓ Import correctly failed when overwrite=false: ${err}') }
|
||||
|
||||
// Update file content and import with overwrite
|
||||
os.write_file(test_overwrite_file, 'Updated content')!
|
||||
fs.import(test_overwrite_file, '/overwrite_test.txt', herofs.ImportOptions{
|
||||
overwrite: true
|
||||
})!
|
||||
println('✓ Import with overwrite=true succeeded')
|
||||
|
||||
// Test export overwrite behavior
|
||||
export_test_file := os.join_path(export_dir, 'overwrite_export_test.txt')
|
||||
|
||||
// Export first time
|
||||
fs.export('/overwrite_test.txt', export_test_file, herofs.ExportOptions{
|
||||
overwrite: false
|
||||
})!
|
||||
|
||||
// Try to export again without overwrite (should fail)
|
||||
println('Testing export without overwrite (should fail)...')
|
||||
fs.export('/overwrite_test.txt', export_test_file, herofs.ExportOptions{
|
||||
overwrite: false
|
||||
}) or { println('✓ Export correctly failed when overwrite=false: ${err}') }
|
||||
|
||||
// Export with overwrite
|
||||
fs.export('/overwrite_test.txt', export_test_file, herofs.ExportOptions{
|
||||
overwrite: true
|
||||
})!
|
||||
println('✓ Export with overwrite=true succeeded')
|
||||
|
||||
// Verify final content
|
||||
final_content := os.read_file(export_test_file)!
|
||||
println('Final exported content: "${final_content}"')
|
||||
|
||||
println('\n✅ Import/Export demonstration completed successfully!')
|
||||
println('All files have been imported to VFS and exported back to real filesystem.')
|
||||
println('Temporary directories will be cleaned up automatically.')
|
||||
}
|
||||
examples/hero/heromodels/.gitignore (vendored, new file, +1)
@@ -0,0 +1 @@
+heroserver_example
@@ -1,10 +1,10 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
 
 mut mydb := heromodels.new()!
-// mydb.calendar.db.redis.flushdb()!
+mydb.calendar.db.redis.flushdb()!
 
 mut o := mydb.calendar.new(
 	name: 'Work Calendar'
@@ -23,10 +23,10 @@ o.tags = mydb.calendar.db.tags_get(['work', 'important'])!
 // Add comments if needed
 // o.comments = mydb.calendar.db.comments_get([CommentArg{comment: 'This is a comment'}])!
 
-oid := mydb.calendar.set(o)!
-mut o2 := mydb.calendar.get(oid)!
+mydb.calendar.set(o)!
+mut o2 := mydb.calendar.get(o.id)!
 
-println('Calendar ID: ${oid}')
+println('Calendar ID: ${o.id}')
 println('Calendar object: ${o2}')
 
 mut objects := mydb.calendar.list()!
examples/hero/heromodels/heromodels_calendar_event.vsh (8 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
@@ -31,10 +31,10 @@ o.tags = mydb.calendar_event.db.tags_get(['work', 'meeting', 'team'])!
 // Add comments if needed
 // o.comments = mydb.calendar_event.db.comments_get([CommentArg{comment: 'This is a comment'}])!
 
-oid := mydb.calendar_event.set(o)!
-mut o2 := mydb.calendar_event.get(oid)!
+mydb.calendar_event.set(o)!
+mut o2 := mydb.calendar_event.get(o.id)!
 
-println('Calendar Event ID: ${oid}')
+println('Calendar Event ID: ${o.id}')
 println('Calendar Event object: ${o2}')
 
 mut objects := mydb.calendar_event.list()!
examples/hero/heromodels/heromodels_calendar_event_simple.vsh (8 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
@@ -31,8 +31,8 @@ o.tags = mydb.calendar_event.db.tags_get(['work', 'meeting', 'team'])!
 // Add comments if needed
 // o.comments = mydb.calendar_event.db.comments_get([CommentArg{comment: 'This is a comment'}])!
 
-oid := mydb.calendar_event.set(o)!
-mut o2 := mydb.calendar_event.get(oid)!
+mydb.calendar_event.set(o)!
+mut o2 := mydb.calendar_event.get(o.id)!
 
-println('Calendar Event ID: ${oid}')
+println('Calendar Event ID: ${o.id}')
 println('Calendar Event object: ${o2}')
examples/hero/heromodels/heromodels_calendar_event_with_recurrence.vsh (8 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
@@ -41,10 +41,10 @@ o.tags = mydb.calendar_event.db.tags_get(['work', 'meeting', 'team'])!
 // Add comments if needed
 // o.comments = mydb.calendar_event.db.comments_get([CommentArg{comment: 'This is a comment'}])!
 
-oid := mydb.calendar_event.set(o)!
-mut o2 := mydb.calendar_event.get(oid)!
+mydb.calendar_event.set(o)!
+mut o2 := mydb.calendar_event.get(o.id)!
 
-println('Calendar Event ID: ${oid}')
+println('Calendar Event ID: ${o.id}')
 println('Calendar Event object: ${o2}')
 
 mut objects := mydb.calendar_event.list()!
examples/hero/heromodels/heromodels_chat_group.vsh (12 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
@@ -15,11 +15,11 @@ mut chat_group := mydb.chat_group.new(
 )!
 
 // Save to database
-oid := mydb.chat_group.set(chat_group)!
-println('Created chat group with ID: ${oid}')
+mydb.chat_group.set(mut chat_group)!
+println('Created chat group with ID: ${chat_group.id}')
 
 // Retrieve from database
-mut chat_group2 := mydb.chat_group.get(oid)!
+mut chat_group2 := mydb.chat_group.get(chat_group.id)!
 println('Retrieved chat group: ${chat_group2}')
 
 // List all chat groups
@@ -29,8 +29,8 @@ println('All chat groups: ${chat_groups}')
 // Update the chat group
 chat_group2.is_archived = true
 chat_group2.last_activity = 1672531200
-mydb.chat_group.set(chat_group2)!
+mydb.chat_group.set(mut chat_group2)!
 
 // Retrieve updated chat group
-mut chat_group3 := mydb.chat_group.get(oid)!
+mut chat_group3 := mydb.chat_group.get(chat_group2.id)!
 println('Updated chat group: ${chat_group3}')
examples/hero/heromodels/heromodels_chat_message.vsh (17 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
@@ -13,14 +13,15 @@ mut chat_group := mydb.chat_group.new(
 	last_activity: 0
 	is_archived: false
 )!
-chat_group_id := mydb.chat_group.set(chat_group)!
+
+mydb.chat_group.set(mut chat_group)!
 
 // Create a new chat message
 mut chat_message := mydb.chat_message.new(
 	name: 'Hello World Message'
 	description: 'A simple hello world message'
 	content: 'Hello, world!'
-	chat_group_id: chat_group_id
+	chat_group_id: chat_group.id
 	sender_id: 1
 	parent_messages: []
 	fs_files: []
@@ -31,11 +32,11 @@ mut chat_message := mydb.chat_message.new(
 )!
 
 // Save to database
-oid := mydb.chat_message.set(chat_message)!
-println('Created chat message with ID: ${oid}')
+mydb.chat_message.set(mut chat_message)!
+println('Created chat message with ID: ${chat_message.id}')
 
 // Retrieve from database
-mut chat_message2 := mydb.chat_message.get(oid)!
+mut chat_message2 := mydb.chat_message.get(chat_message.id)!
 println('Retrieved chat message: ${chat_message2}')
 
 // List all chat messages
@@ -44,8 +45,8 @@ println('All chat messages: ${chat_messages}')
 
 // Update the chat message
 chat_message2.status = .read
-mydb.chat_message.set(chat_message2)!
+mydb.chat_message.set(mut chat_message2)!
 
 // Retrieve updated chat message
-mut chat_message3 := mydb.chat_message.get(oid)!
+mut chat_message3 := mydb.chat_message.get(chat_message.id)!
 println('Updated chat message: ${chat_message3}')
@@ -1,20 +1,20 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.core.redisclient
 import freeflowuniverse.herolib.hero.heromodels
 
 mut mydb := heromodels.new()!
-// mydb.comments.db.redis.flushdb()!
+mydb.comments.db.redis.flushdb()!
 
 mut o := mydb.comments.new(comment: 'Hello, world!')!
 
 o.tags = mydb.comments.db.tags_get(['tag1', 'tag2'])!
 
-oid := mydb.comments.set(o)!
-mut o2 := mydb.comments.get(oid)!
+mydb.comments.set(o)!
+mut o2 := mydb.comments.get(o.id)!
 
-println(oid)
-println(o2)
+println('Comment ID: ${o.id}')
+println('Comment object: ${o2}')
 
-mut objects := mydb.comments.list()!
-println(objects)
+// mut objects := mydb.comments.list()!
+// println(objects)
examples/hero/heromodels/heromodels_group.vsh (16 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 
@@ -18,15 +18,15 @@ mut o := mydb.group.new(
 o.tags = mydb.group.db.tags_get(['team', 'development'])!
 
 // Save to database
-oid := mydb.group.set(o)!
-println('Created Group ID: ${oid}')
+mydb.group.set(o)!
+println('Created Group ID: ${o.id}')
 
 // Check if the group exists
-mut exists := mydb.group.exist(oid)!
+mut exists := mydb.group.exist(o.id)!
 println('Group exists: ${exists}')
 
 // Retrieve from database
-mut o2 := mydb.group.get(oid)!
+mut o2 := mydb.group.get(o.id)!
 println('Retrieved Group object: ${o2}')
 
 // List all groups
@@ -34,9 +34,9 @@ mut objects := mydb.group.list()!
 println('All groups: ${objects}')
 
 // Delete the group
-mydb.group.delete(oid)!
-println('Deleted group with ID: ${oid}')
+mydb.group.delete(o.id)!
+println('Deleted group with ID: ${o.id}')
 
 // Check if the group still exists
-exists = mydb.group.exist(oid)!
+exists = mydb.group.exist(o.id)!
 println('Group exists after deletion: ${exists}')
examples/hero/heromodels/heromodels_group_add_members.vsh (8 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 
@@ -23,11 +23,11 @@ o.add_member(3, heromodels.GroupRole.reader)
 o.tags = mydb.group.db.tags_get(['team', 'marketing'])!
 
 // Save to database
-oid := mydb.group.set(o)!
-println('Created Group ID: ${oid}')
+mydb.group.set(o)!
+println('Created Group ID: ${o.id}')
 
 // Retrieve from database
-mut o2 := mydb.group.get(oid)!
+mut o2 := mydb.group.get(o.id)!
 println('Retrieved Group object: ${o2}')
 
 // Check the number of members
examples/hero/heromodels/heromodels_group_relationships.vsh (22 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 
@@ -14,8 +14,8 @@ mut parent_group := mydb.group.new(
 	parent_group: 0
 )!
 
-parent_oid := mydb.group.set(parent_group)!
-println('Created Parent Group ID: ${parent_oid}')
+mydb.group.set(mut parent_group)!
+println('Created Parent Group ID: ${parent_group.id}')
 
 // Create a subgroup
 mut subgroup := mydb.group.new(
@@ -24,20 +24,20 @@ mut subgroup := mydb.group.new(
 	is_public: false
 	members: []
 	subgroups: []
-	parent_group: parent_oid
+	parent_group: parent_group.id
 )!
 
-sub_oid := mydb.group.set(subgroup)!
-println('Created Subgroup ID: ${sub_oid}')
+mydb.group.set(mut subgroup)!
+println('Created Subgroup ID: ${subgroup.id}')
 
 // Update the parent group to include the subgroup
-mut updated_parent := mydb.group.get(parent_oid)!
-updated_parent.subgroups = [sub_oid]
-mydb.group.set(updated_parent)!
+mut updated_parent := mydb.group.get(parent_group.id)!
+updated_parent.subgroups = [subgroup.id]
+mydb.group.set(mut updated_parent)!
 
 // Retrieve both groups to verify relationships
-mut parent_from_db := mydb.group.get(parent_oid)!
-mut sub_from_db := mydb.group.get(sub_oid)!
+mut parent_from_db := mydb.group.get(parent_group.id)!
+mut sub_from_db := mydb.group.get(subgroup.id)!
 
 println('Parent Group: ${parent_from_db}')
 println('Subgroup: ${sub_from_db}')
examples/hero/heromodels/heromodels_group_with_members.vsh (10 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 
@@ -29,15 +29,15 @@ mut o := mydb.group.new(
 o.tags = mydb.group.db.tags_get(['team', 'development'])!
 
 // Save to database
-oid := mydb.group.set(o)!
-println('Created Group ID: ${oid}')
+mydb.group.set(o)!
+println('Created Group ID: ${o.id}')
 
 // Check if the group exists
-mut exists := mydb.group.exist(oid)!
+mut exists := mydb.group.exist(o.id)!
 println('Group exists: ${exists}')
 
 // Retrieve from database
-mut o2 := mydb.group.get(oid)!
+mut o2 := mydb.group.get(o.id)!
 println('Retrieved Group object: ${o2}')
 
 // List all groups
examples/hero/heromodels/heromodels_project.vsh (14 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl,enable_globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 import freeflowuniverse.herolib.hero.db
@@ -66,11 +66,11 @@ mut project := mydb.project.new(
 )!
 
 // Save the project to the database
-project_id := mydb.project.set(project)!
-println('Created project with ID: ${project_id}')
+mydb.project.set(mut project)!
+println('Created project with ID: ${project.id}')
 
 // Retrieve the project from the database
-mut retrieved_project := mydb.project.get(project_id)!
+mut retrieved_project := mydb.project.get(project.id)!
 println('Retrieved project: ${retrieved_project}')
 
 // List all projects
@@ -78,13 +78,13 @@ mut all_projects := mydb.project.list()!
 println('All projects: ${all_projects}')
 
 // Check if the project exists
-exists := mydb.project.exist(project_id)!
+mut exists := mydb.project.exist(project.id)!
 println('Project exists: ${exists}')
 
 // Delete the project
-mydb.project.delete(project_id)!
+mydb.project.delete(project.id)!
 println('Project deleted')
 
 // Check if the project still exists
-exists = mydb.project.exist(project_id)!
+exists = mydb.project.exist(project.id)!
 println('Project exists after deletion: ${exists}')
examples/hero/heromodels/heromodels_project_issue.vsh (55 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl,enable_globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 import freeflowuniverse.herolib.hero.db
@@ -6,12 +6,51 @@ import freeflowuniverse.herolib.hero.db
 mut mydb := heromodels.new()!
 mydb.project_issue.db.redis.flushdb()!
 
+// Create swimlanes
+swimlane := heromodels.Swimlane{
+	name: 'todo'
+	description: 'Tasks to be done'
+	order: 1
+	color: '#FF0000'
+	is_done: false
+}
+
+// Create milestones
+milestone := heromodels.Milestone{
+	name: 'phase_1'
+	description: 'First development phase'
+	due_date: 1672531200 // 2023-01-01
+	completed: false
+	issues: [u32(1), u32(2)]
+}
+
+// Create a new project
+mut project := mydb.project.new(
+	name: 'Sample Project'
+	description: 'A sample project for demonstration'
+	swimlanes: [swimlane]
+	milestones: [milestone]
+	issues: ['issue1', 'issue2', 'issue3']
+	fs_files: [u32(100), u32(200)]
+	status: .active
+	start_date: '2023-01-01'
+	end_date: '2023-12-31'
+	tags: ['sample', 'demo', 'project']
+	comments: [db.CommentArg{
+		comment: 'This is a sample project'
+	}]
+)!
+
+// Save the project to the database
+mydb.project.set(mut project)!
+println('Created project with ID: ${project.id}')
+
 // Create a new project issue
 mut issue := mydb.project_issue.new(
 	name: 'Fix login bug'
 	description: 'Users are unable to login with their credentials'
 	title: 'Login functionality is broken'
-	project_id: u32(1)
+	project_id: project.id
 	issue_type: .bug
 	priority: .high
 	status: .open
@@ -36,11 +75,11 @@ mut issue := mydb.project_issue.new(
 )!
 
 // Save the issue to the database
-issue_id := mydb.project_issue.set(issue)!
-println('Created project issue with ID: ${issue_id}')
+mydb.project_issue.set(mut issue)!
+println('Created project issue with ID: ${issue.id}')
 
 // Retrieve the issue from the database
-mut retrieved_issue := mydb.project_issue.get(issue_id)!
+mut retrieved_issue := mydb.project_issue.get(issue.id)!
 println('Retrieved project issue: ${retrieved_issue}')
 
 // List all project issues
@@ -48,13 +87,13 @@ mut all_issues := mydb.project_issue.list()!
 println('All project issues: ${all_issues}')
 
 // Check if the issue exists
-exists := mydb.project_issue.exist(issue_id)!
+mut exists := mydb.project_issue.exist(issue.id)!
 println('Project issue exists: ${exists}')
 
 // Delete the issue
-mydb.project_issue.delete(issue_id)!
+mydb.project_issue.delete(issue.id)!
 println('Project issue deleted')
 
 // Check if the issue still exists
-exists = mydb.project_issue.exist(issue_id)!
+exists = mydb.project_issue.exist(issue.id)!
 println('Project issue exists after deletion: ${exists}')
examples/hero/heromodels/heromodels_user.vsh (16 changes, Normal file → Executable file)
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
+#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
 
 import freeflowuniverse.herolib.hero.heromodels
 
@@ -22,15 +22,15 @@ mut o := mydb.user.new(
 )!
 
 // Save to database
-oid := mydb.user.set(o)!
-println('Created User ID: ${oid}')
+mydb.user.set(o)!
+println('Created User ID: ${o.id}')
 
 // Check if the user exists
-mut exists := mydb.user.exist(oid)!
+mut exists := mydb.user.exist(o.id)!
 println('User exists: ${exists}')
 
 // Retrieve from database
-mut o2 := mydb.user.get(oid)!
+mut o2 := mydb.user.get(o.id)!
 println('Retrieved User object: ${o2}')
 
 // List all users
@@ -38,9 +38,9 @@ mut objects := mydb.user.list()!
 println('All users: ${objects}')
 
 // Delete the user
-mydb.user.delete(oid)!
-println('Deleted user with ID: ${oid}')
+mydb.user.delete(o.id)!
+println('Deleted user with ID: ${o.id}')
 
 // Check if the user still exists
-exists = mydb.user.exist(oid)!
+exists = mydb.user.exist(o.id)!
 println('User exists after deletion: ${exists}')
examples/hero/heromodels/heroserver_example.vsh (new executable file, +27)
@@ -0,0 +1,27 @@
+#!/usr/bin/env -S v -n -w -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused run
+
+import freeflowuniverse.herolib.hero.heromodels
+import freeflowuniverse.herolib.hero.db
+import time
+
+fn main() {
+	// Start the server in a background thread with authentication disabled for testing
+	spawn fn () ! {
+		heromodels.new(reset: true, name: 'test')!
+		heromodels.server_start(
+			name: 'test'
+			port: 8080
+			auth_enabled: false // Disable auth for testing
+			cors_enabled: true
+			reset: true
+			allowed_origins: [
+				'http://localhost:5173',
+			]
+		) or { panic('Failed to start HeroModels server: ${err}') }
+	}()
+
+	// Keep the main thread alive
+	for {
+		time.sleep(time.second)
+	}
+}
@@ -1,25 +0,0 @@
-#!/usr/bin/env -S v -n -w -gc none -cc tcc -d use_openssl -enable-globals run
-
-import json
-import freeflowuniverse.herolib.hero.heromodels.openrpc
-import freeflowuniverse.herolib.schemas.jsonrpc
-import freeflowuniverse.herolib.hero.heromodels
-
-fn main() {
-	mut handler := openrpc.new_heromodels_handler()!
-
-	my_calendar := heromodels.calendar_new(
-		name: 'My Calendar'
-		description: 'My Calendar'
-		securitypolicy: 1
-		tags: ['tag1', 'tag2']
-		group_id: 1
-		events: []u32{}
-		color: '#000000'
-		timezone: 'UTC'
-		is_public: true
-	)!
-
-	response := handler.handle(jsonrpc.new_request('calendar_set', json.encode(my_calendar)))!
-	println(response)
-}
@@ -1,12 +1,17 @@
 #!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
 
 import freeflowuniverse.herolib.hero.heromodels.rpc
+import freeflowuniverse.herolib.ui.console
 
 // when httpport is set, the rpc will be available over http
 // if 0, then its only available on unix socket /tmp/heromodels
 mut http_port := 9933
 
 if http_port == 0 {
-	println('
+	console.print_header('
 
 #to test the discover function:
+
+echo \'\{"jsonrpc":"2.0","method":"rpc.discover","params":[],"id":1\}\' \\
+| nc -U /tmp/heromodels
+\'
@@ -14,13 +19,23 @@ if http_port == 0 {
 
 nc -U /tmp/heromodels
 
-then e.g. do 
+then e.g. do
 
 \{"jsonrpc":"2.0","method":"comment_set","params":{"comment":"Hello world!","parent":0,"author":42},"id":1\}
 
 needs to be on one line for openrpc to work
 
 ')
-}
+} else {
+	console.print_header('
+
+#to test the discover function:
+
+curl -X POST -H "Content-Type: application/json" -d \'\{"jsonrpc":"2.0","method":"rpc.discover","id":1,"params":[]\}\' http://localhost:9933
+
+curl -X POST -H "Content-Type: application/json" -d \'\{"jsonrpc":"2.0","method":"comment_set","params":{"comment":"Hello world!","parent":0,"author":42},"id":1\}\' http://localhost:9933
+
+')
+}
 
-rpc.start(http_port: http_port)!
+rpc.start(port: http_port)!
examples/hero/heroserver/.gitignore (vendored, new file, +1)
@@ -0,0 +1 @@
+heroserver
@@ -1,6 +1,21 @@
 #!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals run
 
 import freeflowuniverse.herolib.hero.heroserver
+import freeflowuniverse.herolib.schemas.openrpc
+import os
 
-mut server := heroserver.new_server(port: 8080)!
+// 1. Create a new server instance
+mut server := heroserver.new(port: 8080)!
+
+// 2. Create and register your OpenRPC handlers
+// These handlers must conform to the `openrpc.OpenRPCHandler` interface.
+script_dir := os.dir(@FILE)
+openrpc_path := os.join_path(script_dir, 'openrpc.json')
+handler := openrpc.new_handler(openrpc_path)!
+server.register_handler('comments', handler)!
+
+println('Server starting on http://localhost:8080')
+println('Documentation available at: http://localhost:8080/doc/comments/')
+println('Comments API available at: http://localhost:8080/api/comments')
+
 server.start()!
@@ -1,23 +0,0 @@
-import freeflowuniverse.herolib.hero.heroserver
-import freeflowuniverse.herolib.schemas.openrpc
-
-fn testsuite_begin() {
-	// a clean start
-	// os.rm('./db')! //TODO: was giving issues
-}
-
-fn test_heroserver_new() {
-	// Create server
-	mut server := heroserver.new_server(port: 8080)!
-
-	// Register handlers
-	spec := openrpc.from_file('./openrpc.json')!
-	handler := openrpc.new_handler(spec)
-
-	server.handler_registry.register('comments', handler, spec)
-
-	// Start server
-	go server.start()
-
-	assert true
-}
@@ -26,6 +26,43 @@
          "type": "string"
        }
      }
    },
    {
      "name": "get_comment",
      "summary": "Get a comment by ID",
      "description": "Retrieves a specific comment using its unique identifier.",
      "params": [
        {
          "name": "id",
          "description": "The unique identifier of the comment to retrieve.",
          "required": true,
          "schema": {
            "type": "number",
            "example": 1
          }
        },
        {
          "name": "include_metadata",
          "description": "Whether to include metadata in the response.",
          "required": false,
          "schema": {
            "type": "boolean",
            "example": true
          }
        }
      ],
      "result": {
        "name": "comment",
        "description": "The requested comment object.",
        "schema": {
          "type": "object",
          "example": {
            "id": 1,
            "text": "This is a sample comment",
            "created_at": "2024-01-15T10:30:00Z"
          }
        }
      }
    }
  ],
  "components": {}
16
install_v.sh
@@ -1,4 +1,8 @@
#!/bin/bash -e
#!/bin/bash

set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$SCRIPT_DIR"

# Help function
print_help() {
@@ -188,7 +192,7 @@ function package_install {
is_github_actions() {
    # echo "Checking GitHub Actions environment..."
    # echo "GITHUB_ACTIONS=${GITHUB_ACTIONS:-not set}"
    if [ -n "$GITHUB_ACTIONS" ] && [ "$GITHUB_ACTIONS" = "true" ]; then
    if [ -n "${GITHUB_ACTIONS:-}" ] && [ "$GITHUB_ACTIONS" = "true" ]; then
        echo "Running in GitHub Actions: true"
        return 0
    else
@@ -204,7 +208,7 @@ function myplatform {
    elif [ -e /etc/os-release ]; then
        # Read the ID field from the /etc/os-release file
        export OSNAME=$(grep '^ID=' /etc/os-release | cut -d= -f2)
        if [ "${os_id,,}" == "ubuntu" ]; then
        if [ "${OSNAME,,}" == "ubuntu" ]; then
            export OSNAME="ubuntu"
        fi
        if [ "${OSNAME}" == "archarm" ]; then
@@ -308,13 +312,13 @@ function hero_lib_pull {

function hero_lib_get {

    mkdir -p $DIR_CODE/github/freeflowuniverse
    mkdir -p $DIR_CODE/github/incubaid
    if [[ -d "$DIR_CODE/github/incubaid/herolib" ]]
    then
        hero_lib_pull
    else
        pushd $DIR_CODE/github/freeflowuniverse 2>&1 >> /dev/null
        git clone --depth 1 --no-single-branch https://github.com/incubaid/herolib.git
        pushd $DIR_CODE/github/incubaid 2>&1 >> /dev/null
        git clone --depth 1 --no-single-branch https://github.com/freeflowuniverse/herolib.git
        popd 2>&1 >> /dev/null
    fi
}
@@ -1,3 +0,0 @@

If logic is implemented in the mcp module, then structure it with folders logic and mcp, where the logic resides in /logic and the mcp-related code (such as tool and prompt handlers and server code) in /mcp
37
lib/ai/mcp/handler_logging.v
Normal file
@@ -0,0 +1,37 @@
module mcp

import x.json2
import freeflowuniverse.herolib.schemas.jsonrpc

// LogLevel represents the logging levels supported by MCP
pub enum LogLevel {
    debug
    info
    notice
    warning
    error
    critical
    alert
    emergency
}

// SetLevelParams represents the parameters for the logging/setLevel method
pub struct SetLevelParams {
pub:
    level LogLevel
}

// logging_set_level_handler handles the logging/setLevel request
// This is a stub implementation that accepts the request but doesn't actually change logging behavior
pub fn logging_set_level_handler(data string) !string {
    // Decode the request with SetLevelParams
    request := jsonrpc.decode_request_generic[SetLevelParams](data)!

    // For now, we just acknowledge the request without actually implementing logging level changes
    // In a full implementation, this would configure the server's logging system

    // Create a success response with empty object (logging/setLevel returns {} on success)
    empty_map := map[string]string{}
    response := jsonrpc.new_response_generic[map[string]string](request.id, empty_map)
    return response.encode()
}
30
lib/ai/mcp/mcp.v
Normal file
@@ -0,0 +1,30 @@
module mcp

import freeflowuniverse.herolib.ai.mcpcore

// Re-export the main types from mcpcore
pub type Server = mcpcore.Server
pub type Backend = mcpcore.Backend
pub type MemoryBackend = mcpcore.MemoryBackend
pub type ServerConfiguration = mcpcore.ServerConfiguration
pub type ServerInfo = mcpcore.ServerInfo
pub type ServerParams = mcpcore.ServerParams
pub type Tool = mcpcore.Tool
pub type ToolContent = mcpcore.ToolContent
pub type ToolCallResult = mcpcore.ToolCallResult

// Re-export the main functions from mcpcore
pub fn new_server(backend Backend, params ServerParams) !&mcpcore.Server {
    return mcpcore.new_server(backend, params)
}

// Re-export helper functions from mcpcore
pub fn result_to_mcp_tool_contents[T](result T) []ToolContent {
    return mcpcore.result_to_mcp_tool_contents[T](result)
}

pub fn result_to_mcp_tool_content[T](result T) ToolContent {
    return mcpcore.result_to_mcp_tool_content[T](result)
}

// Note: LogLevel and SetLevelParams are already defined in handler_logging.v
@@ -1,6 +1,6 @@
module mcpgen

import freeflowuniverse.herolib.core.code
import freeflowuniverse.herolib.develop.codetools as code
import freeflowuniverse.herolib.ai.mcp
import freeflowuniverse.herolib.schemas.jsonschema
import freeflowuniverse.herolib.schemas.jsonschema.codegen
@@ -1,7 +1,7 @@
module mcpgen

import freeflowuniverse.herolib.ai.mcp
import freeflowuniverse.herolib.core.code
import freeflowuniverse.herolib.develop.codetools as code
import freeflowuniverse.herolib.schemas.jsonschema
import x.json2 as json { Any }
// import json
@@ -1,13 +1,12 @@
module mcpgen

import freeflowuniverse.herolib.ai.mcp.logger
import freeflowuniverse.herolib.ai.mcp

@[heap]
pub struct MCPGen {}

pub fn new_mcp_server(v &MCPGen) !&mcp.Server {
    logger.info('Creating new Developer MCP server')
    // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication

    // Initialize the server with the empty handlers map
    mut server := mcp.new_server(mcp.MemoryBackend{
@@ -1,56 +0,0 @@
module mcp

import time
import os
import log
import x.json2
import freeflowuniverse.herolib.schemas.jsonrpc

// Server is the main MCP server struct
@[heap]
pub struct Server {
    ServerConfiguration
pub mut:
    client_config ClientConfiguration
    handler       jsonrpc.Handler
    backend       Backend
}

// start starts the MCP server
pub fn (mut s Server) start() ! {
    log.info('Starting MCP server')
    for {
        // Read a message from stdin
        message := os.get_line()
        if message == '' {
            time.sleep(10000) // prevent cpu spinning
            continue
        }

        // Handle the message using the JSON-RPC handler
        response := s.handler.handle(message) or {
            log.error('message: ${message}')
            log.error('Error handling message: ${err}')

            // Try to extract the request ID
            id := jsonrpc.decode_request_id(message) or { 0 }

            // Create an internal error response
            error_response := jsonrpc.new_error(id, jsonrpc.internal_error).encode()
            print(error_response)
            continue
        }

        // Send the response only if it's not empty (notifications return empty responses)
        if response.len > 0 {
            s.send(response)
        }
    }
}

// send sends a response to the client
pub fn (mut s Server) send(response string) {
    // Send the response
    println(response)
    flush_stdout()
}
@@ -3,15 +3,16 @@ module main
import freeflowuniverse.herolib.ai.mcp.vcode

fn main() {
    // Create a new MCP server
    mut server := vcode.new_mcp_server() or {
        eprintln('Failed to create MCP server: ${err}')
    // Create a VCode instance
    v := &vcode.VCode{}

    // Create a placeholder MCP server (actual implementation pending mcpcore fixes)
    result := vcode.new_mcp_server(v) or {
        // Note: Removed eprintln() as it interferes with STDIO transport JSON-RPC communication
        return
    }

    // Start the server
    server.start() or {
        eprintln('Failed to start MCP server: ${err}')
        return
    }
    // Note: Removed println() as it interferes with STDIO transport JSON-RPC communication
    // TODO: Implement actual MCP server startup once mcpcore module is fixed
    _ = result // Use the result to avoid unused variable warning
}
BIN
lib/ai/mcp/vcode/cmd/vcode_mcp
Executable file
Binary file not shown.
@@ -1,33 +0,0 @@
module vcode

import freeflowuniverse.herolib.ai.mcp
import freeflowuniverse.herolib.ai.mcp.logger

@[heap]
pub struct VCode {
    v_version string = '0.1.0'
}

pub fn new_mcp_server(v &VCode) !&mcp.Server {
    logger.info('Creating new Developer MCP server')

    // Initialize the server with the empty handlers map
    mut server := mcp.new_server(mcp.MemoryBackend{
        tools: {
            'get_function_from_file': get_function_from_file_tool
            'write_vfile':            write_vfile_tool
        }
        tool_handlers: {
            'get_function_from_file': v.get_function_from_file_tool_handler
            'write_vfile':            v.write_vfile_tool_handler
        }
    }, mcp.ServerParams{
        config: mcp.ServerConfiguration{
            server_info: mcp.ServerInfo{
                name:    'vcode'
                version: '1.0.0'
            }
        }
    })!
    return server
}
@@ -1,7 +1,7 @@
module vcode

import freeflowuniverse.herolib.ai.mcp
import freeflowuniverse.herolib.core.code
import freeflowuniverse.herolib.develop.codetools as code
import freeflowuniverse.herolib.schemas.jsonschema
import x.json2 { Any }

@@ -1,7 +1,7 @@
module vcode

import freeflowuniverse.herolib.ai.mcp
import freeflowuniverse.herolib.core.code
import freeflowuniverse.herolib.develop.codetools as code
import freeflowuniverse.herolib.schemas.jsonschema
import x.json2 { Any }
@@ -1,27 +1,12 @@
module pugconvert
module vcode

import freeflowuniverse.herolib.ai.mcp
import freeflowuniverse.herolib.ai.mcp.logger
import freeflowuniverse.herolib.schemas.jsonrpc
import freeflowuniverse.herolib.ai.mcp.vcode.logic

pub fn new_mcp_server() !&mcp.Server {
    logger.info('Creating new Developer MCP server')
    // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication

    // Initialize the server with the empty handlers map
    mut server := mcp.new_server(mcp.MemoryBackend{
        tools: {
            'pugconvert': specs
        }
        tool_handlers: {
            'pugconvert': handler
        }
    }, mcp.ServerParams{
        config: mcp.ServerConfiguration{
            server_info: mcp.ServerInfo{
                name:    'developer'
                version: '1.0.0'
            }
        }
    })!
    return server
    // Create a VCode instance and delegate to the logic module
    v := &logic.VCode{}
    return logic.new_mcp_server(v)
}
15
lib/ai/mcp/vcode/server.v
Normal file
@@ -0,0 +1,15 @@
module vcode

// Placeholder VCode struct for the MCP vcode server
@[heap]
pub struct VCode {
    v_version string = '0.1.0'
}

// Placeholder function that will be implemented once mcpcore compilation issues are resolved
pub fn new_mcp_server(v &VCode) !string {
    // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication

    // TODO: Implement actual MCP server creation once mcpcore module is fixed
    return 'VCode MCP server placeholder - version ${v.v_version}'
}
284
lib/ai/mcp/vcode/vlang.v
Normal file
@@ -0,0 +1,284 @@
module vcode

// import freeflowuniverse.herolib.ai.mcp
// import freeflowuniverse.herolib.ai.mcp.logger
import os
import log

fn get_module_dir(mod string) string {
    module_parts := mod.trim_string_left('freeflowuniverse.herolib').split('.')
    return '${os.home_dir()}/code/github/incubaid/herolib/lib/${module_parts.join('/')}'
}

// given a module path and a type name, returns the type definition of that type within that module
// for instance: get_type_from_module('lib/mcp/developer/vlang.v', 'Developer') might return struct Developer {...}
fn get_type_from_module(module_path string, type_name string) !string {
    println('Looking for type ${type_name} in module ${module_path}')
    v_files := list_v_files(module_path) or {
        return error('Failed to list V files in ${module_path}: ${err}')
    }

    for v_file in v_files {
        println('Checking file: ${v_file}')
        content := os.read_file(v_file) or { return error('Failed to read file ${v_file}: ${err}') }

        // Look for both regular and pub struct declarations
        mut type_str := 'struct ${type_name} {'
        mut i := content.index(type_str) or { -1 }
        mut is_pub := false

        if i == -1 {
            // Try with pub struct
            type_str = 'pub struct ${type_name} {'
            i = content.index(type_str) or { -1 }
            is_pub = true
        }

        if i == -1 {
            type_import := content.split_into_lines().filter(it.contains('import')
                && it.contains(type_name))
            if type_import.len > 0 {
                log.debug('debugzoooo')
                mod := type_import[0].trim_space().trim_string_left('import ').all_before(' ')
                return get_type_from_module(get_module_dir(mod), type_name)
            }
            continue
        }
        println('Found type ${type_name} in ${v_file} at position ${i}')

        // Find the start of the struct definition including comments
        mut comment_start := i
        mut line_start := i

        // Find the start of the line containing the struct definition
        for j := i; j >= 0; j-- {
            if j == 0 || content[j - 1] == `\n` {
                line_start = j
                break
            }
        }

        // Find the start of the comment block (if any)
        for j := line_start - 1; j >= 0; j-- {
            if j == 0 {
                comment_start = 0
                break
            }

            // If we hit a blank line or a non-comment line, stop
            if content[j] == `\n` {
                if j > 0 && j < content.len - 1 {
                    // Check if the next line starts with a comment
                    next_line_start := j + 1
                    if next_line_start < content.len && content[next_line_start] != `/` {
                        comment_start = j + 1
                        break
                    }
                }
            }
        }

        // Find the end of the struct definition
        closing_i := find_closing_brace(content, i + type_str.len) or {
            return error('could not find where declaration for type ${type_name} ends')
        }

        // Get the full struct definition including the struct declaration line
        full_struct := content.substr(line_start, closing_i + 1)
        println('Found struct definition:\n${full_struct}')

        // Return the full struct definition
        return full_struct
    }

    return error('type ${type_name} not found in module ${module_path}')
}

// given a module path and a function name, returns the function definition of that function within that module
// for instance: get_function_from_module('lib/mcp/developer/vlang.v', 'develop') might return fn develop(...) {...}
fn get_function_from_module(module_path string, function_name string) !string {
    v_files := list_v_files(module_path) or {
        return error('Failed to list V files in ${module_path}: ${err}')
    }

    println('Found ${v_files.len} V files in ${module_path}')
    for v_file in v_files {
        println('Checking file: ${v_file}')
        result := get_function_from_file(v_file, function_name) or {
            println('Function not found in ${v_file}: ${err}')
            continue
        }
        println('Found function ${function_name} in ${v_file}')
        return result
    }

    return error('function ${function_name} not found in module ${module_path}')
}

fn find_closing_brace(content string, start_i int) ?int {
    mut brace_count := 1
    for i := start_i; i < content.len; i++ {
        if content[i] == `{` {
            brace_count++
        } else if content[i] == `}` {
            brace_count--
            if brace_count == 0 {
                return i
            }
        }
    }
    return none
}

// get_function_from_file parses a V file and extracts a specific function block including its comments
// ARGS:
// file_path string - path to the V file
// function_name string - name of the function to extract
// RETURNS: string - the function block including comments, or empty string if not found
fn get_function_from_file(file_path string, function_name string) !string {
    content := os.read_file(file_path) or {
        return error('Failed to read file: ${file_path}: ${err}')
    }

    lines := content.split_into_lines()
    mut result := []string{}
    mut in_function := false
    mut brace_count := 0
    mut comment_block := []string{}

    for i, line in lines {
        trimmed := line.trim_space()

        // Collect comments that might be above the function
        if trimmed.starts_with('//') {
            if !in_function {
                comment_block << line
            } else if brace_count > 0 {
                result << line
            }
            continue
        }

        // Check if we found the function
        if !in_function && (trimmed.starts_with('fn ${function_name}(')
            || trimmed.starts_with('pub fn ${function_name}(')) {
            in_function = true
            // Add collected comments
            result << comment_block
            comment_block = []
            result << line
            if line.contains('{') {
                brace_count++
            }
            continue
        }

        // If we're inside the function, keep track of braces
        if in_function {
            result << line

            for c in line {
                if c == `{` {
                    brace_count++
                } else if c == `}` {
                    brace_count--
                }
            }

            // If brace_count is 0, we've reached the end of the function
            if brace_count == 0 && trimmed.contains('}') {
                return result.join('\n')
            }
        } else {
            // Reset comment block if we pass a blank line
            if trimmed == '' {
                comment_block = []
            }
        }
    }

    if !in_function {
        return error('Function "${function_name}" not found in ${file_path}')
    }

    return result.join('\n')
}

// list_v_files returns all .v files in a directory (non-recursive), excluding generated files ending with _.v
fn list_v_files(dir string) ![]string {
    files := os.ls(dir) or { return error('Error listing directory: ${err}') }

    mut v_files := []string{}
    for file in files {
        if file.ends_with('.v') && !file.ends_with('_.v') {
            filepath := os.join_path(dir, file)
            v_files << filepath
        }
    }

    return v_files
}

// test runs v test on the specified file or directory
pub fn vtest(fullpath string) !string {
    // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication
    if !os.exists(fullpath) {
        return error('File or directory does not exist: ${fullpath}')
    }
    if os.is_dir(fullpath) {
        mut results := ''
        for item in list_v_files(fullpath)! {
            results += vtest(item)!
            results += '\n-----------------------\n'
        }
        return results
    } else {
        cmd := 'v -gc none -stats -enable-globals -show-c-output -keepc -n -w -cg -o /tmp/tester.c -g -cc tcc test ${fullpath}'
        // Note: Removed logger.debug() as it interferes with STDIO transport JSON-RPC communication
        result := os.execute(cmd)
        if result.exit_code != 0 {
            return error('Test failed for ${fullpath} with exit code ${result.exit_code}\n${result.output}')
        } else {
            // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication
        }
        return 'Command: ${cmd}\nExit code: ${result.exit_code}\nOutput:\n${result.output}'
    }
}

// vvet runs v vet on the specified file or directory
pub fn vvet(fullpath string) !string {
    // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication
    if !os.exists(fullpath) {
        return error('File or directory does not exist: ${fullpath}')
    }

    if os.is_dir(fullpath) {
        mut results := ''
        files := list_v_files(fullpath) or { return error('Error listing V files: ${err}') }
        for file in files {
            results += vet_file(file) or {
                // Note: Removed logger.error() as it interferes with STDIO transport JSON-RPC communication
                return error('Failed to vet ${file}: ${err}')
            }
            results += '\n-----------------------\n'
        }
        return results
    } else {
        return vet_file(fullpath)
    }
}

// vet_file runs v vet on a single file
fn vet_file(file string) !string {
    cmd := 'v vet -v -w ${file}'
    // Note: Removed logger.debug() as it interferes with STDIO transport JSON-RPC communication
    result := os.execute(cmd)
    if result.exit_code != 0 {
        return error('Vet failed for ${file} with exit code ${result.exit_code}\n${result.output}')
    } else {
        // Note: Removed logger.info() as it interferes with STDIO transport JSON-RPC communication
    }
    return 'Command: ${cmd}\nExit code: ${result.exit_code}\nOutput:\n${result.output}'
}

// cmd := 'v -gc none -stats -enable-globals -show-c-output -keepc -n -w -cg -o /tmp/tester.c -g -cc tcc ${fullpath}'
52
lib/ai/mcp/vcode/vlang_tools.v
Normal file
@@ -0,0 +1,52 @@
module vcode

// import freeflowuniverse.herolib.ai.mcpcore
// import freeflowuniverse.herolib.core.code.vlang_utils
// import freeflowuniverse.herolib.core.code
// import freeflowuniverse.herolib.schemas.jsonschema
// import x.json2

// TODO: Uncomment when mcpcore module is fixed
/*
const get_function_from_file_tool = mcpcore.Tool{
    name: 'get_function_from_file'
    description: 'get_function_from_file parses a V file and extracts a specific function block including its comments
ARGS:
file_path string - path to the V file
function_name string - name of the function to extract
RETURNS: string - the function block including comments, or empty string if not found'
    input_schema: jsonschema.Schema{
        typ: 'object'
        properties: {
            'file_path': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'string'
            })
            'function_name': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'string'
            })
        }
        required: ['file_path', 'function_name']
    }
}
*/

// TODO: Uncomment when mcpcore module is fixed
/*
pub fn (d &VCode) get_function_from_file_tool_handler(arguments map[string]Any) !mcpcore.ToolCallResult {
    file_path := arguments['file_path'] or {
        return mcpcore.error_tool_call_result(error('Missing file_path argument'))
    }.str()
    function_name := arguments['function_name'] or {
        return mcpcore.error_tool_call_result(error('Missing function_name argument'))
    }.str()

    // TODO: Implement actual function extraction from file
    // For now, return a placeholder message
    result := 'Function extraction from ${file_path} for function ${function_name} is not yet implemented'

    return mcpcore.ToolCallResult{
        is_error: false
        content: mcpcore.result_to_mcp_tool_contents[string](result)
    }
}
*/
72
lib/ai/mcp/vcode/write_vfile_tool.v
Normal file
@@ -0,0 +1,72 @@
module vcode

// import freeflowuniverse.herolib.ai.mcpcore
// TODO: Uncomment when mcpcore module is fixed
/*
// import freeflowuniverse.herolib.develop.codetools as code
import freeflowuniverse.herolib.schemas.jsonschema
import x.json2 { Any }

const write_vfile_tool = mcpcore.Tool{
    name: 'write_vfile'
    description: 'write_vfile parses a V code string into a VFile and writes it to the specified path
ARGS:
path string - directory path where to write the file
code string - V code content to write
format bool - whether to format the code (optional, default: false)
overwrite bool - whether to overwrite existing file (optional, default: false)
prefix string - prefix to add to the filename (optional, default: "")
RETURNS: string - success message with the path of the written file'
    input_schema: jsonschema.Schema{
        typ: 'object'
        properties: {
            'path': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'string'
            })
            'code': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'string'
            })
            'format': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'boolean'
            })
            'overwrite': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'boolean'
            })
            'prefix': jsonschema.SchemaRef(jsonschema.Schema{
                typ: 'string'
            })
        }
        required: ['path', 'code']
    }
}
*/

// TODO: Uncomment when mcpcore module is fixed
/*
pub fn (d &VCode) write_vfile_tool_handler(arguments map[string]Any) !mcpcore.ToolCallResult {
    path := arguments['path'] or {
        return mcpcore.error_tool_call_result(error('Missing path argument'))
    }.str()
    code_str := arguments['code'] or {
        return mcpcore.error_tool_call_result(error('Missing code argument'))
    }.str()

    // Parse optional parameters with defaults
    format := if 'format' in arguments { arguments['format'] or { false }.bool() } else { false }
    overwrite := if 'overwrite' in arguments {
        arguments['overwrite'] or { false }.bool()
    } else {
        false
    }
    prefix := if 'prefix' in arguments { arguments['prefix'] or { '' }.str() } else { '' }

    // TODO: Implement actual V file parsing and writing
    // For now, return a placeholder message
    result := 'Writing V file to ${path} with code length ${code_str.len} (format: ${format}, overwrite: ${overwrite}, prefix: "${prefix}") is not yet implemented'

    return mcpcore.ToolCallResult{
        is_error: false
        content: mcpcore.result_to_mcp_tool_contents[string](result)
    }
}
*/
@@ -1,8 +1,8 @@
module mcp
module mcpcore

import x.json2

interface Backend {
pub interface Backend {
    // Resource methods
    resource_exists(uri string) !bool
    resource_get(uri string) !Resource
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import x.json2

@@ -6,8 +6,8 @@ import freeflowuniverse.herolib.osal.core as osal
// import freeflowuniverse.herolib.ai.mcp.vcode
// import freeflowuniverse.herolib.ai.mcp.mcpgen
// import freeflowuniverse.herolib.ai.mcp.baobab
import freeflowuniverse.herolib.ai.mcp.rhai.mcp as rhai_mcp
import freeflowuniverse.herolib.ai.mcp.rust
// import freeflowuniverse.herolib.ai.mcp.rhai.mcp as rhai_mcp
// import freeflowuniverse.herolib.ai.mcp.rust

fn main() {
    do() or { panic(err) }
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import time
import os
@@ -27,6 +27,8 @@ pub fn new_server(backend Backend, params ServerParams) !&Server {
    // Core handlers
    'initialize': server.initialize_handler
    'notifications/initialized': initialized_notification_handler
    // Logging handlers
    'logging/setLevel': server.logging_set_level_handler
    // Resource handlers
    'resources/list': server.resources_list_handler
    'resources/read': server.resources_read_handler
@@ -1,4 +1,4 @@
module mcp
module mcpcore

pub fn result_to_mcp_tool_contents[T](result T) []ToolContent {
    return [result_to_mcp_tool_content[T](result)]
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import time
import os
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import freeflowuniverse.herolib.schemas.jsonrpc
import json
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import time
import os
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import time
import os
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import time
import os
@@ -1,8 +1,5 @@
module mcp
module mcpcore

import time
import os
import log
import x.json2
import json
import freeflowuniverse.herolib.schemas.jsonrpc
@@ -113,11 +110,9 @@ fn (mut s Server) tools_call_handler(data string) !string {
        }
    }

    log.error('Calling tool: ${tool_name} with arguments: ${arguments}')
    // Note: Removed log.error() calls as they interfere with STDIO transport JSON-RPC communication
    // Call the tool with the provided arguments
    result := s.backend.tool_call(tool_name, arguments)!

    log.error('Received result from tool: ${tool_name} with result: ${result}')
    // Create a success response with the result
    response := jsonrpc.new_response_generic[ToolCallResult](request_map['id'].int(),
        result)
@@ -137,7 +132,7 @@ pub fn (mut s Server) send_tools_list_changed_notification() ! {
    notification := jsonrpc.new_blank_notification('notifications/tools/list_changed')
    s.send(json.encode(notification))
    // Send the notification to all connected clients
    log.info('Sending tools list changed notification: ${json.encode(notification)}')
    // Note: Removed log.info() as it interferes with STDIO transport JSON-RPC communication
}

pub fn error_tool_call_result(err IError) ToolCallResult {
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import time
import os
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import freeflowuniverse.herolib.schemas.jsonrpc
import json
@@ -1,4 +1,4 @@
module mcp
module mcpcore

import freeflowuniverse.herolib.schemas.jsonrpc
62
lib/ai/mcpcore/server.v
Normal file
62
lib/ai/mcpcore/server.v
Normal file
@@ -0,0 +1,62 @@
module mcpcore

import time
import os
import freeflowuniverse.herolib.schemas.jsonrpc

// Server is the main MCP server struct
@[heap]
pub struct Server {
	ServerConfiguration
pub mut:
	client_config ClientConfiguration
	handler       jsonrpc.Handler
	backend       Backend
}

// start starts the MCP server
pub fn (mut s Server) start() ! {
	// Note: Removed log.info() as it interferes with STDIO transport JSON-RPC communication
	for {
		// Read a message from stdin
		message := os.get_line()
		if message == '' {
			time.sleep(10000) // prevent cpu spinning
			continue
		}

		// Parse the JSON-RPC request
		request := jsonrpc.decode_request(message) or {
			// Note: Removed stderr logging as it can interfere with MCP Inspector
			// Try to extract the request ID for error response
			id := jsonrpc.decode_request_id(message) or { 0 }
			// Create an invalid request error response
			error_response := jsonrpc.new_error(id, jsonrpc.invalid_request).encode()
			println(error_response)
			continue
		}

		// Handle the message using the JSON-RPC handler
		response := s.handler.handle(request) or {
			// Note: Removed stderr logging as it can interfere with MCP Inspector

			// Create an internal error response
			error_response := jsonrpc.new_error(request.id, jsonrpc.internal_error).encode()
			println(error_response)
			continue
		}

		// Send the response (notifications may return empty responses)
		response_str := response.encode()
		if response_str.len > 0 {
			s.send(response_str)
		}
	}
}

// send sends a response to the client
pub fn (mut s Server) send(response string) {
	// Send the response
	println(response)
	flush_stdout()
}
@@ -16,13 +16,14 @@ pub fn playmacro(action playbook.Action) !string {

	if action.name == 'employee_wiki' {
		return employee_wiki(p, sim)!
	} else if action.name == 'employees_wiki' {
		return employees_wiki(p, sim)!
	} else if action.name == 'department_wiki' {
		return department_wiki(p, sim)!
	} else if action.name == 'revenues_wiki' {
		return revenues_wiki(p, mut sim)!
	}
	// } else if action.name == 'employees_wiki' {
	// 	return employees_wiki(p, sim)!
	// } else if action.name == 'department_wiki' {
	// 	return department_wiki(p, sim)!
	// } else if action.name == 'revenues_wiki' {
	// 	return revenues_wiki(p, mut sim)!
	// }

	return error("couldn't find macro '${action.name}' for bizmodel.")
}
@@ -62,8 +63,8 @@ fn employee_wiki(p paramsparser.Params, sim BizModel) !string {

	// theme := 'light'
	// theme := 'dark' // Removed unused variable
	mut t := $tmpl('./templates/employee_old.md')
	return t
	// mut t := $tmpl('./templates/employee.md')
	return ''
}

fn employees_wiki(p paramsparser.Params, sim BizModel) !string {

@@ -167,7 +167,7 @@ pub mut:
}

// RecurrenceRule for recurring agenda items
pub struct RecurrenceRule {
pub struct PlannerRecurrenceRule {
pub mut:
	frequency RecurrenceFrequency
	interval  int = 1 // Every N frequency units

228 lib/core/code/improvements.md Normal file
@@ -0,0 +1,228 @@
# Code Review and Improvement Plan for HeroLib Code Module

## Overview
The HeroLib `code` module provides utilities for parsing and generating V language code. It's designed to be a lightweight alternative to `v.ast` for code analysis and generation across multiple languages. While the module has a good foundational structure, there are several areas that need improvement.

## Issues Identified

### 1. Incomplete TypeScript Generation Support
- The `typescript()` method exists in some models but lacks a comprehensive implementation
- Missing TypeScript generation for complex types (arrays, maps, results)
- No TypeScript interface generation for structs

### 2. Template System Issues
- Some templates are empty (e.g., `templates/function/method.py`, `templates/comment/comment.py`)
- Template usage is inconsistent across the codebase
- No clear separation between V and other language templates

### 3. Missing Parser Documentation Examples
- README.md mentions codeparser but doesn't show how to use the parser from this module
- No clear examples of parsing V files or modules

### 4. Incomplete Type Handling
- The `parse_type` function doesn't handle all V language types comprehensively
- Missing support for function types, sum types, and complex generics
- No handling of optional types (`?Type`)

### 5. Code Structure and Consistency
- Some functions lack proper error handling
- Inconsistent naming conventions in test files
- Missing documentation for several key functions

## Improvement Plan

### 1. Complete TypeScript Generation Implementation

**What needs to be done:**
- Implement comprehensive TypeScript generation in `model_types.v`
- Add TypeScript generation for all type variants
- Create proper TypeScript interface generation in `model_struct.v`

**Specific fixes:**

```v
// In model_types.v, improve the typescript() method:
pub fn (t Type) typescript() string {
	return match t {
		Map { 'Record<string, ${t.typ.typescript()}>' }
		Array { '${t.typ.typescript()}[]' }
		Object { t.name }
		Result { 'Promise<${t.typ.typescript()}>' } // Better representation for async operations
		Boolean { 'boolean' }
		Integer { 'number' }
		Alias { t.name }
		String { 'string' }
		Function { '(...args: any[]) => any' } // More appropriate for function types
		Void { 'void' }
	}
}

// In model_struct.v, improve the typescript() method:
pub fn (s Struct) typescript() string {
	name := texttools.pascal_case(s.name)
	fields := s.fields.map(it.typescript()).join('\n  ')
	return 'export interface ${name} {\n  ${fields}\n}'
}
```

### 2. Fix Template System

**What needs to be done:**
- Remove empty Python template files
- Ensure all templates are properly implemented
- Add template support for other languages

**Specific fixes:**
- Delete `templates/function/method.py` and `templates/comment/comment.py` if they're not needed
- Add proper TypeScript templates for struct and interface generation
- Create consistent template naming conventions

### 3. Improve Parser Documentation

**What needs to be done:**
- Add clear examples in README.md showing how to use the parser
- Document the parsing functions with practical examples

**Specific fixes:**
Add to README.md:

````markdown
## Parsing V Code

The code module provides utilities to parse V code into structured models:

```v
import freeflowuniverse.herolib.core.code

// Parse a V file
content := os.read_file('example.v') or { panic(err) }
vfile := code.parse_vfile(content) or { panic(err) }

// Access parsed information
println('Module: ${vfile.mod}')
println('Number of functions: ${vfile.functions().len}')
println('Number of structs: ${vfile.structs().len}')

// Parse individual components
function := code.parse_function(fn_code_string) or { panic(err) }
struct_ := code.parse_struct(struct_code_string) or { panic(err) }
```
````

### 4. Complete Type Handling

**What needs to be done:**
- Extend `parse_type` to handle more complex V types
- Add support for optional types (`?Type`)
- Improve generic type parsing

**Specific fixes:**

```v
// In model_types.v, enhance parse_type function:
pub fn parse_type(type_str string) Type {
	mut type_str_trimmed := type_str.trim_space()

	// Handle optional types
	if type_str_trimmed.starts_with('?') {
		return Optional{parse_type(type_str_trimmed.all_after('?'))}
	}

	// Handle function types
	if type_str_trimmed.starts_with('fn ') {
		// Parse function signature
		return Function{}
	}

	// Handle sum types
	if type_str_trimmed.contains('|') {
		types := type_str_trimmed.split('|').map(parse_type(it.trim_space()))
		return Sum{types}
	}

	// Existing parsing logic...
}
```
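The order of those checks matters: optionals and function types must be recognized before the sum-type split, or a type like `?fn () | string` would be misclassified. A minimal sketch of the same dispatch in Python (illustrative only; the tuple-based representation is a stand-in for the V sum-type models):

```python
def parse_type(s):
    """Toy mirror of the proposed parse_type dispatch above:
    optionals first, then function types, then sum types,
    falling back to a plain named type."""
    s = s.strip()
    if s.startswith("?"):
        return ("optional", parse_type(s[1:]))
    if s.startswith("fn "):
        return ("function", s)  # signature parsing elided, as in the V sketch
    if "|" in s:
        return ("sum", [parse_type(p) for p in s.split("|")])
    return ("plain", s)
```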

### 5. Code Structure Improvements

**What needs to be done:**
- Add proper error handling to all parsing functions
- Standardize naming conventions
- Improve documentation consistency

**Specific fixes:**
- Add error checking in `parse_function`, `parse_struct`, and other parsing functions
- Ensure all public functions have clear documentation comments
- Standardize test function names

## Module Generation to Other Languages

### Current Implementation
The current code shows basic TypeScript generation support, but it's incomplete. The generation should:

1. **Support multiple languages**: The code structure allows for multi-language generation, but only TypeScript has a partial implementation
2. **Use templates consistently**: All language generation should use the template system
3. **Separate language-specific code**: Each language should have its own generation module

### What Needs to Move to Other Modules

**TypeScript Generation Module:**
- Move all TypeScript-specific generation code to a new `typescript` module
- Create TypeScript templates for structs, interfaces, and functions
- Add proper TypeScript formatting support

**Example Structure:**
```
lib/core/code/
├── model_types.v      # Core type models (language agnostic)
├── model_struct.v     # Core struct/function models (language agnostic)
└── typescript/        # TypeScript-specific generation
    ├── generator.v    # TypeScript generation logic
    └── templates/     # TypeScript templates
```

### Parser Usage Examples (to add to README.md)

```v
// Parse a V file into a structured representation
content := os.read_file('mymodule/example.v') or { panic(err) }
vfile := code.parse_vfile(content)!

// Extract all functions
functions := vfile.functions()
println('Found ${functions.len} functions')

// Extract all structs
structs := vfile.structs()
for s in structs {
	println('Struct: ${s.name}')
	for field in s.fields {
		println('  Field: ${field.name} (${field.typ.symbol()})')
	}
}

// Find a specific function
if greet_fn := vfile.get_function('greet') {
	println('Found function: ${greet_fn.name}')
	println('Parameters: ${greet_fn.params.map(it.name)}')
	println('Returns: ${greet_fn.result.typ.symbol()}')
}

// Parse a function from string
fn_code := '
pub fn add(a int, b int) int {
	return a + b
}
'
function := code.parse_function(fn_code)!
println('Parsed function: ${function.name}')
```

## Summary of Required Actions

1. **Implement complete TypeScript generation** across all model types
2. **Remove empty template files** and organize templates properly
3. **Enhance type parsing** to handle optional types, function types, and sum types
4. **Add comprehensive parser documentation** with practical examples to README.md
5. **Create language-specific generation modules** to separate concerns
6. **Improve error handling** in all parsing functions
7. **Standardize documentation and naming** conventions across the module

These improvements will make the code module more robust, easier to use, and better prepared for multi-language code generation.
@@ -1,4 +1,4 @@
module code
module codetools

import freeflowuniverse.herolib.ui.console
import os
@@ -191,79 +191,3 @@ pub fn get_type_from_module(module_path string, type_name string) !string {

	return error('type ${type_name} not found in module ${module_path}')
}

// ===== V LANGUAGE TOOLS =====

// vtest runs v test on the specified file or directory
// ARGS:
//   fullpath string - path to the file or directory to test
// RETURNS:
//   string - test results output, or error if test fails
pub fn vtest(fullpath string) !string {
	console.print_item('test ${fullpath}')
	if !os.exists(fullpath) {
		return error('File or directory does not exist: ${fullpath}')
	}
	if os.is_dir(fullpath) {
		mut results := ''
		for item in list_v_files(fullpath)! {
			results += vtest(item)!
			results += '\n-----------------------\n'
		}
		return results
	} else {
		cmd := 'v -gc none -stats -enable-globals -show-c-output -keepc -n -w -cg -o /tmp/tester.c -g -cc tcc test ${fullpath}'
		console.print_debug('Executing command: ${cmd}')
		result := os.execute(cmd)
		if result.exit_code != 0 {
			return error('Test failed for ${fullpath} with exit code ${result.exit_code}\n${result.output}')
		} else {
			console.print_item('Test completed for ${fullpath}')
		}
		return 'Command: ${cmd}\nExit code: ${result.exit_code}\nOutput:\n${result.output}'
	}
}

// vet_file runs v vet on a single file
// ARGS:
//   file string - path to the file to vet
// RETURNS:
//   string - vet results output, or error if vet fails
fn vet_file(file string) !string {
	cmd := 'v vet -v -w ${file}'
	console.print_debug('Executing command: ${cmd}')
	result := os.execute(cmd)
	if result.exit_code != 0 {
		return error('Vet failed for ${file} with exit code ${result.exit_code}\n${result.output}')
	} else {
		console.print_item('Vet completed for ${file}')
	}
	return 'Command: ${cmd}\nExit code: ${result.exit_code}\nOutput:\n${result.output}'
}

// vvet runs v vet on the specified file or directory
// ARGS:
//   fullpath string - path to the file or directory to vet
// RETURNS:
//   string - vet results output, or error if vet fails
pub fn vvet(fullpath string) !string {
	console.print_item('vet ${fullpath}')
	if !os.exists(fullpath) {
		return error('File or directory does not exist: ${fullpath}')
	}

	if os.is_dir(fullpath) {
		mut results := ''
		files := list_v_files(fullpath) or { return error('Error listing V files: ${err}') }
		for file in files {
			results += vet_file(file) or {
				console.print_stderr('Failed to vet ${file}: ${err}')
				return error('Failed to vet ${file}: ${err}')
			}
			results += '\n-----------------------\n'
		}
		return results
	} else {
		return vet_file(fullpath)
	}
}
@@ -5,13 +5,15 @@ import freeflowuniverse.herolib.core.pathlib
// Logger Factory
pub struct LoggerFactoryArgs {
pub mut:
	path string
	path           string
	console_output bool = true
}

pub fn new(args LoggerFactoryArgs) !Logger {
	mut p := pathlib.get_dir(path: args.path, create: true)!
	return Logger{
		path: p
		lastlog_time: 0
		path:           p
		lastlog_time:   0
		console_output: args.console_output
	}
}
@@ -3,6 +3,7 @@ module logger
import os
import freeflowuniverse.herolib.core.texttools
import freeflowuniverse.herolib.data.ourtime
import freeflowuniverse.herolib.ui.console

@[params]
pub struct LogItemArgs {
@@ -61,4 +62,26 @@ pub fn (mut l Logger) log(args_ LogItemArgs) ! {
	}
	f.writeln(content.trim_space_right())!
	f.close()

	// Also write to console if enabled
	if l.console_output {
		l.write_to_console(args, t)!
	}
}

// Write log message to console with clean formatting
fn (mut l Logger) write_to_console(args LogItemArgs, t ourtime.OurTime) ! {
	timestamp := t.time().format_ss()
	error_indicator := if args.logtype == .error { 'ERROR' } else { 'INFO' }
	category := args.cat.trim_space()
	lines := args.log.split('\n')

	for i, line in lines {
		if i == 0 {
			console.print_info('${timestamp} [${error_indicator}] [${category}] ${line}')
		} else {
			// Indent continuation lines
			console.print_info('${timestamp} [${error_indicator}] [${category}] ${line}')
		}
	}
}
@@ -6,8 +6,9 @@ import freeflowuniverse.herolib.core.pathlib
@[heap]
pub struct Logger {
pub mut:
	path pathlib.Path
	lastlog_time i64 // to see in log format, every second we put a time down, we need to know if we are in a new second (logs can come in much faster)
	path           pathlib.Path
	lastlog_time   i64 // to see in log format, every second we put a time down, we need to know if we are in a new second (logs can come in much faster)
	console_output bool // whether to also output to console/stdout
}

pub struct LogItem {

29 lib/core/redisclient/redisclient_cmd.v Normal file
@@ -0,0 +1,29 @@
module redisclient

import freeflowuniverse.herolib.data.resp

pub fn (mut r Redis) cmd_str(script string, args []string) !string {
	mut cmds := []string{}
	cmds << script
	cmds << args
	return r.send_expect_strnil(cmds)!
}

pub fn (mut r Redis) cmd_list_str(script string, args []string) ![]string {
	mut cmds := []string{}
	cmds << script
	cmds << args
	response := r.send_expect_list(cmds)!
	mut result := []string{}
	for item in response {
		result << resp.get_redis_value(item)
	}
	return result
}

pub fn (mut r Redis) cmd_int(script string, args []string) !int {
	mut cmds := []string{}
	cmds << script
	cmds << args
	return r.send_expect_int(cmds)!
}
@@ -282,6 +282,44 @@ pub fn (mut r Redis) scan(cursor int) !(string, []string) {
	return resp.get_redis_value(res[0]), values
}

@[params]
pub struct HScanOpts {
pub mut:
	match string
	count int
}

pub fn (mut r Redis) hscan(key string, cursor int, opts HScanOpts) !(string, []string) {
	mut cmd := ['HSCAN', key, cursor.str()]

	if opts.match != '' {
		cmd << 'MATCH'
		cmd << opts.match
	}

	if opts.count > 0 {
		cmd << 'COUNT'
		cmd << opts.count.str()
	}

	res := r.send_expect_list(cmd)!
	if res[0] !is resp.RBString {
		return error('Redis HSCAN wrong response type (cursor)')
	}

	if res[1] !is resp.RArray {
		return error('Redis HSCAN wrong response type (list content)')
	}

	mut values := []string{}

	for i in 0 .. resp.get_redis_array_len(res[1]) {
		values << resp.get_redis_value_by_index(res[1], i)
	}

	return resp.get_redis_value(res[0]), values
}
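Like `SCAN`, `HSCAN` is cursor-based: the caller must re-issue the command with the returned cursor until the server answers `0`. A sketch of that drain loop in Python (the `hscan` callable stands in for the V method above; all names are illustrative):

```python
def hscan_all(hscan, key, match='', count=100):
    """Collect every field/value returned by a cursor-based HSCAN.

    `hscan(key, cursor, match, count)` must return (next_cursor, values),
    mirroring the hscan signature above. Iteration ends when the server
    hands back cursor '0'.
    """
    cursor = 0
    out = []
    while True:
        next_cursor, values = hscan(key, cursor, match, count)
        out.extend(values)
        if next_cursor == '0':
            return out
        cursor = int(next_cursor)
```

Note that a `COUNT` hint bounds work per round trip, not the total result size; the loop is what guarantees completeness.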

// Add the specified members to the set stored at key. Specified members that are already a member
// of this set are ignored. If key does not exist, a new set is created before adding the specified members.
// An error is returned when the value stored at key is not a set.

@@ -2,9 +2,10 @@ module redisclient

@[params]
pub struct RedisURL {
pub mut:
	address string = '127.0.0.1'
	port    int    = 6379
	// db int
	db int
}

pub fn get_redis_url(url string) !RedisURL {
@@ -20,5 +21,28 @@ pub fn get_redis_url(url string) !RedisURL {

pub fn core_get(url RedisURL) !&Redis {
	mut r := new('${url.address}:${url.port}')!
	if url.db > 0 {
		r.selectdb(url.db)!
	}
	return r
}

// give a test db; if db is 0, we set it to 15
pub fn test_get(url_ RedisURL) !&Redis {
	mut url := url_
	if url.db == 0 {
		url = RedisURL{
			address: url_.address
			port:    url_.port
			db:      15
		}
	}
	return core_get(url)!
}

// delete the test db
pub fn test_delete(url_ RedisURL) !&Redis {
	mut r := test_get(url_)!
	r.flushdb()!
	return r
}

102 lib/crypt/herocrypt/herocrypt.v Normal file
@@ -0,0 +1,102 @@
module herocrypt

import freeflowuniverse.herolib.core.redisclient

// HeroCrypt provides a client for HeroDB's AGE cryptography features.
pub struct HeroCrypt {
pub mut:
	redis_client &redisclient.Redis
}

// new returns a new HeroCrypt client.
// url e.g. localhost:6381 (default)
// It pings the server to ensure a valid connection.
pub fn new(url_ string) !&HeroCrypt {
	mut url := url_
	if url == '' {
		url = 'localhost:6381'
	}
	mut redis := redisclient.new(url)!
	redis.ping()!
	return &HeroCrypt{
		redis_client: redis
	}
}

// new_default returns a new HeroCrypt client with the default URL.
pub fn new_default() !&HeroCrypt {
	return new('')
}

// -- Stateless (Ephemeral) Methods --

// gen_enc_keypair generates an ephemeral encryption keypair.
// Returns: [recipient_public_key, identity_secret_key]
pub fn (mut self HeroCrypt) gen_enc_keypair() ![]string {
	return self.redis_client.cmd_list_str('AGE', ['GENENC'])
}

// encrypt encrypts a message with a recipient public key.
pub fn (mut self HeroCrypt) encrypt(recipient_public_key string, message string) !string {
	return self.redis_client.cmd_str('AGE', ['ENCRYPT', recipient_public_key, message])
}

// decrypt decrypts a ciphertext with an identity secret key.
pub fn (mut self HeroCrypt) decrypt(identity_secret_key string, ciphertext_b64 string) !string {
	return self.redis_client.cmd_str('AGE', ['DECRYPT', identity_secret_key, ciphertext_b64])
}

// gen_sign_keypair generates an ephemeral signing keypair.
// Returns: [verify_public_key_b64, sign_secret_key_b64]
pub fn (mut self HeroCrypt) gen_sign_keypair() ![]string {
	return self.redis_client.cmd_list_str('AGE', ['GENSIGN'])
}

// sign signs a message with a signing secret key.
pub fn (mut self HeroCrypt) sign(sign_secret_key_b64 string, message string) !string {
	return self.redis_client.cmd_str('AGE', ['SIGN', sign_secret_key_b64, message])
}

// verify verifies a signature with a public verification key.
pub fn (mut self HeroCrypt) verify(verify_public_key_b64 string, message string, signature_b64 string) !bool {
	res := self.redis_client.cmd_str('AGE', ['VERIFY', verify_public_key_b64, message, signature_b64])!
	return res == '1'
}

// -- Key-Managed (Persistent, Named) Methods --

// keygen generates and persists a named encryption keypair.
pub fn (mut self HeroCrypt) keygen(name string) ![]string {
	return self.redis_client.cmd_list_str('AGE', ['KEYGEN', name])
}

// encrypt_by_name encrypts a message using a named key.
pub fn (mut self HeroCrypt) encrypt_by_name(name string, message string) !string {
	return self.redis_client.cmd_str('AGE', ['ENCRYPTNAME', name, message])
}

// decrypt_by_name decrypts a ciphertext using a named key.
pub fn (mut self HeroCrypt) decrypt_by_name(name string, ciphertext_b64 string) !string {
	return self.redis_client.cmd_str('AGE', ['DECRYPTNAME', name, ciphertext_b64])
}

// sign_keygen generates and persists a named signing keypair.
pub fn (mut self HeroCrypt) sign_keygen(name string) ![]string {
	return self.redis_client.cmd_list_str('AGE', ['SIGNKEYGEN', name])
}

// sign_by_name signs a message using a named signing key.
pub fn (mut self HeroCrypt) sign_by_name(name string, message string) !string {
	return self.redis_client.cmd_str('AGE', ['SIGNNAME', name, message])
}

// verify_by_name verifies a signature with a named verification key.
pub fn (mut self HeroCrypt) verify_by_name(name string, message string, signature_b64 string) !bool {
	res := self.redis_client.cmd_str('AGE', ['VERIFYNAME', name, message, signature_b64])!
	return res == '1'
}

// // list_keys lists all stored AGE keys.
// pub fn (mut self HeroCrypt) list_keys() ![]string {
// 	return self.redis_client.cmd_list_str('AGE', ['LIST'])
// }

116 lib/crypt/herocrypt/readme.md Normal file
@@ -0,0 +1,116 @@
# HeroCrypt: Effortless Cryptography with HeroDB

HeroCrypt is a library that provides a simple and secure way to handle encryption and digital signatures by leveraging the power of [HeroDB](https://git.ourworld.tf/herocode/herodb). It abstracts the underlying complexity of cryptographic operations, allowing you to secure your application's data with minimal effort.

## What is HeroDB?

HeroDB is a high-performance, Redis-compatible database that offers built-in support for advanced cryptographic operations using the [AGE encryption format](https://age-encryption.org/). This integration makes it an ideal backend for applications requiring robust, end-to-end security.

## Core Features of HeroCrypt

- **Encryption & Decryption**: Securely encrypt and decrypt data.
- **Digital Signatures**: Sign and verify messages to ensure their integrity and authenticity.
- **Flexible Key Management**: Choose between two modes for managing your cryptographic keys:
  1. **Key-Managed (Server-Side)**: Let HeroDB manage your keys. Keys are stored securely within the database and are referenced by a name. This is the recommended approach for simplicity and centralized key management.
  2. **Stateless (Client-Side)**: Manage your own keys on the client side. You pass the key material with every cryptographic operation. This mode is for advanced users who require full control over their keys.

## Starting a HeroDB Instance

See https://git.ourworld.tf/herocode/herodb

To start a local instance:

```bash
hero git pull https://git.ourworld.tf/herocode/herodb
~/code/git.ourworld.tf/herocode/herodb/run.sh
```

## Key-Managed Mode (Recommended)

In this mode, HeroDB generates and stores the keypairs for you. You only need to provide a name for your key.

### Encryption

```v
import freeflowuniverse.herolib.crypt.herocrypt

mut client := herocrypt.new_default()!

// Generate and persist a named encryption keypair
client.keygen('my_app_key')!

// Encrypt a message
message := 'This is a secret message.'
ciphertext := client.encrypt_by_name('my_app_key', message)!

// Decrypt the message
decrypted_message := client.decrypt_by_name('my_app_key', ciphertext)!
assert decrypted_message == message
```

### Signing

```v
import freeflowuniverse.herolib.crypt.herocrypt

mut client := herocrypt.new_default()!

// Generate and persist a named signing keypair
client.sign_keygen('my_signer_key')!

// Sign a message
message := 'This message needs to be signed.'
signature := client.sign_by_name('my_signer_key', message)!

// Verify the signature
is_valid := client.verify_by_name('my_signer_key', message, signature)!
assert is_valid
```

## Stateless Mode (Advanced)

In this mode, you are responsible for generating and managing your own keys.

### Encryption

```v
import freeflowuniverse.herolib.crypt.herocrypt

mut client := herocrypt.new_default()!

// Generate an ephemeral encryption keypair
keypair := client.gen_enc_keypair()!
recipient_pub := keypair[0]
identity_sec := keypair[1]

// Encrypt a message
message := 'This is a secret message.'
ciphertext := client.encrypt(recipient_pub, message)!

// Decrypt the message
decrypted_message := client.decrypt(identity_sec, ciphertext)!
assert decrypted_message == message
```

### Signing

```v
import freeflowuniverse.herolib.crypt.herocrypt

mut client := herocrypt.new_default()!

// Generate an ephemeral signing keypair
keypair := client.gen_sign_keypair()!
verify_pub_b64 := keypair[0]
sign_sec_b64 := keypair[1]

// Sign a message
message := 'This message needs to be signed.'
signature := client.sign(sign_sec_b64, message)!

// Verify the signature
is_valid := client.verify(verify_pub_b64, message, signature)!
assert is_valid
```

84 lib/crypt/herocrypt/specs.md Normal file
@@ -0,0 +1,84 @@
# HeroCrypt Technical Specifications

This document provides the low-level technical details for the HeroCrypt library, including the underlying Redis commands used to interact with HeroDB's AGE cryptography features.

## Communication Protocol

HeroCrypt communicates with HeroDB using the standard Redis protocol. All commands are sent as RESP (Redis Serialization Protocol) arrays.
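As an illustration of what this means on the wire, the sketch below encodes an `AGE GENENC` command as a RESP array of bulk strings. This is a minimal, hypothetical encoder written for clarity; it is not part of HeroCrypt, which handles RESP framing internally.

```python
def resp_encode(*parts: str) -> bytes:
    """Encode a command as a RESP array of bulk strings.

    RESP frames an N-element command as '*N\r\n' followed by one
    '$<len>\r\n<bytes>\r\n' bulk string per argument.
    """
    out = [f"*{len(parts)}\r\n".encode()]
    for p in parts:
        data = p.encode()
        out.append(f"${len(data)}\r\n".encode() + data + b"\r\n")
    return b"".join(out)

# An 'AGE GENENC' command is sent as a two-element array:
print(resp_encode("AGE", "GENENC"))
# → b'*2\r\n$3\r\nAGE\r\n$6\r\nGENENC\r\n'
```

The same framing applies to every command below; multi-argument commands such as `AGE ENCRYPT <recipient> <message>` simply become longer arrays.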
## Stateless (Ephemeral) AGE Operations

These commands are used when the client manages its own keys.

### Encryption

- **`AGE GENENC`**: Generates an ephemeral encryption keypair.
  - **Returns**: An array containing the recipient (public key) and the identity (secret key).

- **`AGE ENCRYPT <recipient> <message>`**: Encrypts a message using the recipient's public key.
  - **`<recipient>`**: The public key (`age1...`).
  - **`<message>`**: The plaintext message to encrypt.
  - **Returns**: A base64-encoded ciphertext.

- **`AGE DECRYPT <identity> <ciphertext>`**: Decrypts a ciphertext using the identity (secret key).
  - **`<identity>`**: The secret key (`AGE-SECRET-KEY-1...`).
  - **`<ciphertext>`**: The base64-encoded ciphertext.
  - **Returns**: The decrypted plaintext message.

### Signing

- **`AGE GENSIGN`**: Generates an ephemeral signing keypair.
  - **Returns**: An array containing the public verification key (base64) and the secret signing key (base64).

- **`AGE SIGN <sign_secret_b64> <message>`**: Signs a message with the secret key.
  - **`<sign_secret_b64>`**: The base64-encoded secret signing key.
  - **`<message>`**: The message to sign.
  - **Returns**: A base64-encoded signature.

- **`AGE VERIFY <verify_pub_b64> <message> <signature_b64>`**: Verifies a signature.
  - **`<verify_pub_b64>`**: The base64-encoded public verification key.
  - **`<message>`**: The original message.
  - **`<signature_b64>`**: The base64-encoded signature.
  - **Returns**: `1` if the signature is valid, `0` otherwise.
## Key-Managed (Persistent, Named) AGE Operations

These commands are used when HeroDB manages the keys.

### Encryption

- **`AGE KEYGEN <name>`**: Generates and persists a named encryption keypair.
  - **`<name>`**: A unique name for the key.
  - **Returns**: An array containing the recipient and identity (for initial export, if needed).

- **`AGE ENCRYPTNAME <name> <message>`**: Encrypts a message using a named key.
  - **`<name>`**: The name of the stored key.
  - **`<message>`**: The plaintext message.
  - **Returns**: A base64-encoded ciphertext.

- **`AGE DECRYPTNAME <name> <ciphertext>`**: Decrypts a ciphertext using a named key.
  - **`<name>`**: The name of the stored key.
  - **`<ciphertext>`**: The base64-encoded ciphertext.
  - **Returns**: The decrypted plaintext message.

### Signing

- **`AGE SIGNKEYGEN <name>`**: Generates and persists a named signing keypair.
  - **`<name>`**: A unique name for the key.
  - **Returns**: An array containing the public and secret keys (for initial export).

- **`AGE SIGNNAME <name> <message>`**: Signs a message using a named key.
  - **`<name>`**: The name of the stored key.
  - **`<message>`**: The message to sign.
  - **Returns**: A base64-encoded signature.

- **`AGE VERIFYNAME <name> <message> <signature_b64>`**: Verifies a signature using a named key.
  - **`<name>`**: The name of the stored key.
  - **`<message>`**: The original message.
  - **`<signature_b64>`**: The base64-encoded signature.
  - **Returns**: `1` if the signature is valid, `0` otherwise.

### Key Listing

- **`AGE LIST`**: Lists all stored AGE keys.
  - **Returns**: An array of key names.
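Several of the commands above return base64-encoded blobs (ciphertexts, keys, signatures). A client can treat these as opaque strings and only decode them when raw bytes are needed. A quick sketch of that round-trip in plain Python, independent of HeroDB:

```python
import base64

raw = b"example ciphertext bytes"          # stand-in for raw output bytes
blob = base64.b64encode(raw).decode()      # the string form the protocol carries
print(blob)
# → ZXhhbXBsZSBjaXBoZXJ0ZXh0IGJ5dGVz

# Decoding recovers the original bytes exactly
assert base64.b64decode(blob) == raw
```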
@@ -1,17 +0,0 @@
security add-internet-password -s git.threefold.info -a despiegk -w mypassword

-s: The server (e.g., git.threefold.info).
-a: The account or username (e.g., despiegk).
-w: The password (e.g., mypassword).

security find-internet-password -s git.threefold.info -w

security delete-internet-password -s git.threefold.info

git config --global credential.helper osxkeychain
@@ -1,202 +0,0 @@
module keysafe

import freeflowuniverse.herolib.core.pathlib
// import freeflowuniverse.herolib.data.mnemonic // buggy for now
import encoding.hex
import libsodium
import json
import os
import freeflowuniverse.herolib.ui.console

/*
 * KeysSafe
 *
 * This module implements a secure keys manager.
 *
 * When loading a keysafe object, you can specify a directory and a secret.
 * In that directory, a file called '.keys' will be created and encrypted using
 * the 'secret' provided (AES-CBC).
 *
 * The content of that file is a JSON dictionary mapping each key name to its
 * mnemonic; a single mnemonic is enough to derive the ed25519 and x25519 keys.
 *
 * When loaded, the private/public signing key and the public/private encryption
 * keys are ready to be used.
 *
 * key_generate_add() generates a new key and stores it under the specified name
 * key_import_add() imports an existing key based on its seed and the specified name
 */

pub struct KeysSafe {
pub mut:
	path   pathlib.Path       // file path of keys
	loaded bool               // flag to know if keysafe is loaded or loading
	secret string             // secret to encrypt local file
	keys   map[string]PrivKey // list of keys
}

pub struct PersistantKeysSafe {
pub mut:
	keys map[string]string // store name/mnemonics only
}

// note: the root key needs to be a 'SigningKey' from libsodium
// from that SigningKey we can derive the PrivateKey needed to encrypt

pub fn keysafe_get(path0 string, secret string) !KeysSafe {
	mut path := pathlib.get_file(path: path0 + '/.keys', create: true)!
	mut safe := KeysSafe{
		path:   path
		secret: secret
	}

	if os.exists(path.absolute()) {
		console.print_debug('[+] key file already exists, loading it')
		safe.load()
	}

	safe.loaded = true

	return safe
}

// for testing purposes you can generate multiple keys
pub fn (mut ks KeysSafe) generate_multiple(count int) ! {
	for i in 0 .. count {
		ks.key_generate_add('name_${i}')!
	}
}

// generating a new key is just importing a key with a random seed
pub fn (mut ks KeysSafe) key_generate_add(name string) !PrivKey {
	mut seed := []u8{}

	// generate a new random seed
	for _ in 0 .. 32 {
		seed << u8(libsodium.randombytes_random())
	}

	return ks.key_import_add(name, seed)
}

fn internal_key_encode(key []u8) string {
	return '0x' + hex.encode(key)
}

fn internal_key_decode(key string) []u8 {
	parsed := hex.decode(key.substr(2, key.len)) or { panic(err) }
	return parsed
}

// import based on an existing seed
pub fn (mut ks KeysSafe) key_import_add(name string, seed []u8) !PrivKey {
	if name in ks.keys {
		return error('A key with that name already exists')
	}

	mnemonic := internal_key_encode(seed) // mnemonic(seed)
	signkey := libsodium.new_ed25519_signing_key_seed(seed)
	privkey := libsodium.new_private_key_from_signing_ed25519(signkey)

	// console.print_debug("===== SEED ====")
	// console.print_debug(seed)
	// console.print_debug(mnemonic)

	pk := PrivKey{
		name:     name
		mnemonic: mnemonic
		privkey:  privkey
		signkey:  signkey
	}

	ks.key_add(pk)!
	return pk
}

pub fn (mut ks KeysSafe) get(name string) !PrivKey {
	if !ks.exists(name) {
		return error('key not found')
	}

	return ks.keys[name]
}

pub fn (mut ks KeysSafe) exists(name string) bool {
	return name in ks.keys
}

pub fn (mut ks KeysSafe) key_add(pk PrivKey) ! {
	ks.keys[pk.name] = pk

	// do not persist keys if the keysafe is not loaded yet;
	// this means we are probably loading keys from file
	if ks.loaded {
		ks.persist()
	}
}

pub fn (mut ks KeysSafe) persist() {
	console.print_debug('[+] saving keys to ${ks.path.absolute()}')
	serialized := ks.serialize()
	// console.print_debug(serialized)

	encrypted := symmetric_encrypt_blocks(serialized.bytes(), ks.secret)

	mut f := os.create(ks.path.absolute()) or { panic(err) }
	f.write(encrypted) or { panic(err) }
	f.close()
}

pub fn (mut ks KeysSafe) serialize() string {
	mut pks := PersistantKeysSafe{}

	// serialize mnemonics only
	for key, val in ks.keys {
		pks.keys[key] = val.mnemonic
	}

	export := json.encode(pks)

	return export
}

pub fn (mut ks KeysSafe) load() {
	console.print_debug('[+] loading keys from ${ks.path.absolute()}')

	mut f := os.open(ks.path.absolute()) or { panic(err) }

	// read encrypted file
	filesize := os.file_size(ks.path.absolute())
	mut encrypted := []u8{len: int(filesize)}

	f.read(mut encrypted) or { panic(err) }
	f.close()

	// decrypt file using ks secret
	plaintext := symmetric_decrypt_blocks(encrypted, ks.secret)

	// (try to) decode the json and load keys
	ks.deserialize(plaintext.bytestr())
}

pub fn (mut ks KeysSafe) deserialize(input string) {
	mut pks := json.decode(PersistantKeysSafe, input) or {
		console.print_debug('Failed to decode json, wrong secret or corrupted file: ${err}')
		return
	}

	// only mnemonics were serialized; re-derive the keys from them
	for name, mnemo in pks.keys {
		console.print_debug('[+] loading key: ${name}')
		seed := internal_key_decode(mnemo) // mnemonic.parse(mnemo)

		// console.print_debug("==== SEED ====")
		// console.print_debug(mnemo)
		// console.print_debug(seed)

		ks.key_import_add(name, seed) or { panic(err) }
	}

	// console.print_debug(ks)
}
@@ -1,45 +0,0 @@
module keysafe

import libsodium
import encoding.hex

pub struct PrivKey {
pub:
	name     string
	mnemonic string
	signkey  libsodium.SigningKey // master key
	privkey  libsodium.PrivateKey // derived from master key
}

pub fn key_encode(key []u8) string {
	return '0x' + hex.encode(key)
}

// retrieve the master public key from a PrivKey object
// this is the public key that needs to be shared with a remote user so they can verify what we signed with our private key
// it is shared as a hex-encoded string (66 chars)
pub fn (key PrivKey) master_public() string {
	x := key.signkey.verify_key.public_key

	mut target := []u8{len: x.len}
	unsafe { C.memcpy(target.data, &x[0], x.len) }

	return key_encode(target)
}

// sign data with our signing key
// output is a []u8 bytestring
// to get bytes from a string do: mystring.bytes()
pub fn (key PrivKey) sign(data []u8) []u8 {
	return key.signkey.sign(data)
}

// sign data with our signing key
// output is a hex string
// to get bytes from a string do: mystring.bytes()
pub fn (key PrivKey) sign_hex(data []u8) string {
	return hex.encode(key.sign(data))
}
@@ -1,41 +0,0 @@
module keysafe

import libsodium
import encoding.hex

pub struct PubKey {
pub:
	name   string
	source PrivKey // ourselves (private key, to sign messages)
	remote []u8    // target public key (to encrypt)
}

pub fn pubkey_new(name string, myself PrivKey, remote string) !PubKey {
	parsed := hex.decode(remote.substr(2, remote.len))!

	// convert the SigningKey to a PrivateKey (ed25519 > x25519)
	// to allow encryption and decryption
	target := []u8{len: libsodium.public_key_size}
	libsodium.crypto_sign_ed25519_pk_to_curve25519(target.data, parsed[0])

	return PubKey{
		name:   name
		source: myself
		remote: target
	}
}

// encrypt bytes so that only the owner of this pubkey can decrypt them
pub fn (key PubKey) encrypt(data []u8) ![]u8 {
	box := libsodium.new_box(key.source.privkey, key.remote)
	return box.encrypt(data)
}

// decrypt data that was encrypted for us
pub fn (key PubKey) decrypt(data []u8) []u8 {
	box := libsodium.new_box(key.source.privkey, key.remote)
	decrypted := box.decrypt(data)

	return decrypted
}
@@ -1,47 +0,0 @@
# Keysafe

A safe implementation to help you sign, encrypt, decrypt and store your keys locally.

## Internals

When loading a keysafe object, you can specify a `directory` and a `secret`.
In that directory, a file called `.keys` will be created and encrypted using
the `secret` provided (`AES-CBC`).

The content of that file is a JSON dictionary mapping each key name to its mnemonic;
a single mnemonic is enough to derive the `ed25519` and `x25519` keys.
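Once decrypted, the file is just the JSON serialization of that name-to-mnemonic map. A hypothetical example with a single key named `demo` (the hex seed value below is made up for illustration; real seeds are 32 random bytes, hex-encoded with a `0x` prefix):

```json
{
  "keys": {
    "demo": "0x6b2f1a9c4d8e03b75f12c6a0e9d4871b3c5f20a8d1e64b97f03c8a5d2e176490"
  }
}
```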
When loaded, the private/public signing key and the public/private encryption keys
are ready to be used.

- `key_generate_add()` generates a new key and stores it under the specified name
- `key_import_add()` imports an existing key based on its seed and the specified name

## Example

```v
module main

import freeflowuniverse.herolib.crypt.keysafe

fn main() {
	mut ks := keysafe.keysafe_get('/tmp/', 'helloworld')!
	println(ks)

	ks.key_generate_add('demo') or { println(err) }
	println(ks)

	if ks.exists('demo') {
		println('key demo exists')
	}
}
```

## Keys

Note about keys: when generating a new key, the "master key" is an Ed25519 `SigningKey`. From
that key, we can derive an X25519 `PrivateKey` (encryption key).

Public keys can be converted the same way. When exchanging public keys, always exchange the public `SigningKey`
(the master key in our terminology). From that `SigningKey`, KeysSafe derives the encryption `PublicKey`
for you.
Some files were not shown because too many files have changed in this diff