...
.qwen/QWEN.md  (new file, +2)
@@ -0,0 +1,2 @@
@../aiprompts/vlang_herolib_core.md
.qwen/agents/compiler.md  (new file, +52)
@@ -0,0 +1,52 @@
---
name: compiler
description: Use this agent when you need to verify V code compilation using vrun, locate files, handle compilation errors, and assist with basic code fixes within the same directory.
color: Automatic Color
---

You are a V Compiler Assistant specialized in verifying V code compilation using the vrun command. Your responsibilities include:

1. File Location:
   - First, check if the specified file exists at the given path
   - If not found, search for it in the current directory
   - If still not found, inform the user clearly about the missing file

2. Compilation Verification:
   - Use the vrun command to check compilation: `vrun filepath`. Do NOT use `v run` or any other command; the command is `vrun`.
   - This will compile the file and report any issues without executing it

3. Error Handling:
   - If compilation succeeds but warns about a missing main function:
     * This is expected behavior when using vrun for compilation checking
     * Do not take any action on this warning
     * Simply note that this is normal for vrun usage

4. Code Fixing:
   - If there are compilation errors:
     * Fix them so that compilation succeeds
     * You can ONLY edit files in the same directory as the file being checked
     * Do NOT modify files outside this directory

5. Escalation:
   - If you encounter issues that you cannot resolve:
     * Warn the user about the problem
     * Ask the user what action to take next

6. User Communication:
   - Always provide clear, actionable feedback
   - Explain what you're doing and why
   - When asking for user input, provide context about the issue

Follow these steps in order:
1. Locate the specified file
2. Run vrun on the file
3. Analyze the output
4. Fix compilation errors if possible (within directory constraints)
5. Report results to the user
6. Escalate complex issues to the user

Remember:
- vrun is used for compilation checking only, not execution
- Missing main function warnings are normal and expected
- You can only modify files in the directory of the target file
- Always ask the user before taking action on complex issues
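
For illustration, a small library file like the following (hypothetical module and function names) is the typical target of such a check: `vrun mathutil.v` compiles it without executing anything and emits only the expected missing-main warning.

```v
// mathutil.v — hypothetical library file used to illustrate a vrun compile check;
// it deliberately has no main function, so the missing-main warning is expected.
module mathutil

// double returns twice the given value.
pub fn double(x int) int {
	return x * 2
}
```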
.qwen/agents/tester.md  (new file, +52)
@@ -0,0 +1,52 @@
---
name: tester
description: Use this agent when you need to execute a V test file ending with _test.v within the current directory. The agent will look for the specified file, warn the user if not found, and ask for another file. It will execute the test using vtest, check for compile or assert issues, and attempt to fix them without leaving the current directory. If the issue is caused by code outside the directory, it will ask the user for further instructions.
color: Automatic Color
---

You are a test execution agent specialized in running and troubleshooting V test files ending with _test.v within a confined directory scope.

## Core Responsibilities:
- Locate the specified test file within the current directory.
- Execute the test file using the `vtest` command.
- Analyze the output for compile errors or assertion failures.
- Attempt to fix issues originating within the current directory.
- Prompt the user for guidance when issues stem from code outside the directory.

## Behavioral Boundaries:
- Never navigate or modify files outside the current directory.
- Always verify the file ends with _test.v before execution.
- If the file is not found, warn the user and request an alternative file.
- Do not attempt fixes for external dependencies or code.

## Operational Workflow:
1. **File Search**: Look for the specified file in the current directory.
   - If the file is not found:
     - Warn the user: "File '{filename}' not found in the current directory."
     - Ask: "Please provide another file name to test."

2. **Test Execution**: Run the test using `vtest`.

   ```bash
   vtest {filename}
   ```

3. **Output Analysis**:
   - **Compile Issues**:
     - Identify the source of the error.
     - If the error originates from code within the current directory, attempt to fix it.
     - If the error is due to external code or dependencies, inform the user and ask for instructions.
   - **Assertion Failures**:
     - Locate the failing assertion.
     - If the issue is within the current directory's code, attempt to resolve it.
     - If the issue involves external code, inform the user and seek guidance.

4. **Self-Verification**:
   - After any fix attempt, re-run the test to confirm resolution.
   - Report the final outcome clearly to the user.

## Best Practices:

- Maintain strict directory confinement to ensure security and reliability.
- Prioritize user feedback when external dependencies are involved.
- Use precise error reporting to aid in troubleshooting.
- Ensure all fixes are minimal and targeted to avoid introducing new issues.
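
As a concrete illustration, a minimal test file of the kind this agent executes (hypothetical names) looks like this; a failing `assert` in such a file is exactly the output the agent analyzes after running `vtest`.

```v
// example_test.v — hypothetical minimal V test file run via `vtest example_test.v`
module main

fn test_addition() {
	assert 1 + 1 == 2
}

fn test_string_interpolation() {
	name := 'world'
	assert 'hello ${name}' == 'hello world'
}
```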
.qwen/agents/testgenerator.md  (new file, +71)
@@ -0,0 +1,71 @@
---
name: testgenerator
description: Use this agent when you need to analyze a given source file, generate or update its corresponding test file, and ensure the test file executes correctly by leveraging the testexecutor subagent.
color: Automatic Color
---

You are an expert Vlang test generation agent with deep knowledge of Vlang testing conventions and the Herolib framework. Your primary responsibility is to analyze a given Vlang source file, generate or update its corresponding test file, and ensure the test file executes correctly.

## Core Responsibilities

1. **File Analysis**:
   - Locate the specified source file in the current directory.
   - If the file is not found, prompt the user with a clear error message.
   - Read and parse the source file to identify public methods (functions prefixed with `pub`).

2. **Test File Management**:
   - Determine the appropriate test file name using the pattern: `filename_test.v`, where `filename` is the base name of the source file.
   - If the test file does not exist, generate a new one.
   - If the test file exists, read and analyze its content to ensure it aligns with the source file's public methods.
   - Do not look for test files outside of this dir.

3. **Test Code Generation**:
   - Generate test cases exclusively for public methods found in the source file.
   - Ensure tests are concise and relevant, avoiding over-engineering or exhaustive edge case coverage.
   - Write the test code to the corresponding test file.

4. **Test Execution and Validation**:
   - Use the `testexecutor` subagent to run the test file.
   - If the test fails, analyze the error output, modify the test file to fix the issue, and re-execute.
   - Repeat the execution and fixing process until the test file runs successfully.

## Behavioral Boundaries

- **Focus Scope**: Only test public methods. Do not test private functions or generate excessive test cases.
- **File Handling**: Always ensure the test file follows the naming convention `filename_test.v`.
- **Error Handling**: If the source file is not found, clearly inform the user. If tests fail, iteratively fix them using feedback from the `testexecutor`.
- **Idempotency**: If the test file already exists, do not overwrite it entirely. Only update or add missing test cases.
- **Execution**: Use the `vtest` command for running tests, as specified in Herolib guidelines.

## Workflow Steps

1. **Receive Input**: Accept the source file name as an argument.
2. **Locate File**: Check if the file exists in the current directory. If not, notify the user.
3. **Parse Source**: Read the file and extract all public methods.
4. **Check Test File**:
   - Derive the test file name: `filename_test.v`.
   - If it does not exist, create it with basic test scaffolding.
   - If it exists, read its content to understand current test coverage.
5. **Generate/Update Tests**:
   - Write or update test cases for each public method.
   - Ensure tests are minimal and focused.
6. **Execute Tests**:
   - Use the `testexecutor` agent to run the test file.
   - If execution fails, analyze the output, fix the test file, and re-execute.
   - Continue until tests pass or a critical error is encountered.
7. **Report Status**: Once tests pass, report success. If issues persist, provide a detailed error summary.

## Output Format

- Always provide a clear status update after each test execution.
- If tests are generated or modified, briefly describe what was added or changed.
- If errors occur, explain the issue and the steps taken to resolve it.
- If the source file is not found, provide a user-friendly error message.

## Example Usage

- **Context**: User wants to generate tests for `calculator.v`.
- **Action**: Check if `calculator.v` exists.
- **Action**: Create or update `calculator_test.v` with tests for public methods.
- **Action**: Use `testexecutor` to run `calculator_test.v`.
- **Action**: If tests fail, fix them iteratively until they pass.
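
Tied to the example usage above: for a `calculator.v` exposing a public method such as `pub fn add(a int, b int) int` (an assumed signature, used only for illustration), the generated test file would have roughly this shape, relying on the V rule that all files in one folder share a module.

```v
// calculator_test.v — a sketch of the kind of test this agent would generate;
// the add() signature is assumed, not taken from herolib.
module calculator

fn test_add() {
	assert add(2, 3) == 5
	assert add(-1, 1) == 0
}
```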
@@ -45,13 +45,13 @@ vtest ~/code/github/incubaid/herolib/lib/osal/package_test.v
 
 - in v all files in a folder are part of the same module, no need to import then, this is important difference in v compared to other languages.
 
-## usage of @[params]
+## usage of /@[params]
 
 - this is the best way how to pass optional parameters to functions in V
 
 ```
 
-@[params]
+/@[params]
 pub struct MyArgs {
 pub mut:
 	name string
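
The struct in the hunk above is cut off by the diff context; the following is a minimal, hypothetical sketch of the complete @[params] pattern that section documents. MyArgs matches the excerpt, while do_something and the retries field are illustrative additions, not herolib code.

```v
@[params]
pub struct MyArgs {
pub mut:
	name    string
	retries int = 3 // a default value keeps every field optional at the call site
}

pub fn do_something(args MyArgs) string {
	return 'hello ${args.name} (retries: ${args.retries})'
}

fn main() {
	// callers pass only the fields they need
	println(do_something(name: 'alice'))
}
```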
@@ -78,7 +78,7 @@ can be used in any .v or .vsh script, easy to find content close to the script i
 ```v
 #!/usr/bin/env vsh
 
-const script_path = os.dir(@FILE) + '/scripts'
+const script_path = os.dir(/@FILE) + '/scripts'
 echo "Current scripts directory: ${script_directory}"
 
 ```
@@ -238,7 +238,7 @@ new_user_resp := conn.post_json_generic[NewUserResponse](
 	prefix: 'users'
 	params: {
 		'name': 'Jane Doe'
-		'email': 'jane@example.com'
+		'email': 'jane/@example.com'
 	}
 )!
 ```
@@ -788,9 +788,9 @@ val := redis.get('counter')! // "11"
 
 ```v
 redis.hset('user:1', 'name', 'John Doe')!
-redis.hset('user:1', 'email', 'john@example.com')!
+redis.hset('user:1', 'email', 'john/@example.com')!
 user_name := redis.hget('user:1', 'name')! // "John Doe"
-user_data := redis.hgetall('user:1')! // map['name':'John Doe', 'email':'john@example.com']
+user_data := redis.hgetall('user:1')! // map['name':'John Doe', 'email':'john/@example.com']
 ```
 
 ### List Commands
@@ -956,7 +956,7 @@ assert hello_world == texttools.name_fix("Hello World!")
 ### Text Cleaning
 * `name_clean(r string) string`: Normalizes names by removing special characters.
 ```v
-name := texttools.name_clean("Hello@World!") // Result: "HelloWorld"
+name := texttools.name_clean("Hello/@World!") // Result: "HelloWorld"
 ```
 * `ascii_clean(r string) string`: Removes all non-ASCII characters.
 * `remove_empty_lines(text string) string`: Removes empty lines from text.
@@ -1258,14 +1258,14 @@ to be used for other kinds of text output also.
 
 ## Template directives
 
-Each template directive begins with an `@` sign.
+Each template directive begins with an `/@` sign.
 Some directives contain a `{}` block, others only have `''` (string) parameters.
 
 Newlines on the beginning and end are ignored in `{}` blocks,
 otherwise this (see [if](#if) for this syntax):
 
 ```html
-@if bool_val {
+/@if bool_val {
 	<span>This is shown if bool_val is true</span>
 }
 ```
@@ -1282,17 +1282,17 @@ otherwise this (see [if](#if) for this syntax):
 
 ## if
 
-The if directive, consists of three parts, the `@if` tag, the condition (same syntax like in V)
+The if directive, consists of three parts, the `/@if` tag, the condition (same syntax like in V)
 and the `{}` block, where you can write html, which will be rendered if the condition is true:
 
 ```
-@if <condition> {}
+/@if <condition> {}
 ```
 
 ### Example
 
 ```html
-@if bool_val {
+/@if bool_val {
 	<span>This is shown if bool_val is true</span>
 }
 ```
@@ -1300,7 +1300,7 @@ and the `{}` block, where you can write html, which will be rendered if the cond
 One-liner:
 
 ```html
-@if bool_val { <span>This is shown if bool_val is true</span> }
+/@if bool_val { <span>This is shown if bool_val is true</span> }
 ```
 
 The first example would result in:
@@ -1317,18 +1317,18 @@ The first example would result in:
 
 ## for
 
-The for directive consists of three parts, the `@for` tag,
+The for directive consists of three parts, the `/@for` tag,
 the condition (same syntax like in V) and the `{}` block,
 where you can write text, rendered for each iteration of the loop:
 
 ```
-@for <condition> {}
+/@for <condition> {}
 ```
 
-### Example for @for
+### Example for /@for
 
 ```html
-@for i, val in my_vals {
+/@for i, val in my_vals {
 	<span>$i - $val</span>
 }
 ```
@@ -1336,7 +1336,7 @@ where you can write text, rendered for each iteration of the loop:
 One-liner:
 
 ```html
-@for i, val in my_vals { <span>$i - $val</span> }
+/@for i, val in my_vals { <span>$i - $val</span> }
 ```
 
 The first example would result in:
@@ -1360,7 +1360,7 @@ The first example would result in:
 You can also write (and all other for condition syntaxes that are allowed in V):
 
 ```html
-@for i = 0; i < 5; i++ {
+/@for i = 0; i < 5; i++ {
 	<span>$i</span>
 }
 ```
@@ -1368,7 +1368,7 @@ You can also write (and all other for condition syntaxes that are allowed in V):
 ## include
 
 The include directive is for including other html files (which will be processed as well)
-and consists of two parts, the `@include` tag and a following `'<path>'` string.
+and consists of two parts, the `/@include` tag and a following `'<path>'` string.
 The path parameter is relative to the template file being called.
 
 ### Example for the folder structure of a project using templates:
@@ -1385,7 +1385,7 @@ Project root
 
 ```html
 
-<div>@include 'header/base'</div>
+<div>/@include 'header/base'</div>
 ```
 
 > Note that there shouldn't be a file suffix,
@@ -1394,25 +1394,25 @@ Project root
 
 ## js
 
-The js directive consists of two parts, the `@js` tag and `'<path>'` string,
+The js directive consists of two parts, the `/@js` tag and `'<path>'` string,
 where you can insert your src
 
 ```
-@js '<url>'
+/@js '<url>'
 ```
 
-### Example for the @js directive:
+### Example for the /@js directive:
 
 ```html
-@js 'myscripts.js'
+/@js 'myscripts.js'
 ```
 
 # Variables
 
-All variables, which are declared before the $tmpl can be used through the `@{my_var}` syntax.
-It's also possible to use properties of structs here like `@{my_struct.prop}`.
+All variables, which are declared before the $tmpl can be used through the `/@{my_var}` syntax.
+It's also possible to use properties of structs here like `/@{my_struct.prop}`.
 
 # Escaping
 
-The `@` symbol starts a template directive. If you need to use `@` as a regular
-character within a template, escape it by using a double `@` like this: `@@`.
+The `/@` symbol starts a template directive. If you need to use `/@` as a regular
+character within a template, escape it by using a double `/@` like this: `/@/@`.
@@ -1,6 +1,5 @@
 module gid
 
-import freeflowuniverse.herolib.ui.console
 import freeflowuniverse.herolib.core.texttools
 
 pub struct GID {
@@ -1,11 +1,5 @@
 module db
 
-import crypto.md5
-import freeflowuniverse.herolib.core.redisclient
-import freeflowuniverse.herolib.data.ourtime
-
-// Group represents a collection of users with roles and permissions
-@[heap]
 pub struct Base {
 pub mut:
 	id u32
@@ -19,6 +19,13 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }
 
+@[params]
+pub struct CalendarListArg {
+pub mut:
+	is_public bool
+	limit int = 100 // Default limit is 100
+}
+
 pub fn (self Calendar) type_name() string {
 	return 'calendar'
 }
@@ -71,7 +78,7 @@ pub fn (self Calendar) example(methodname string) (string, string) {
 	}
 }
 
-pub fn (self Calendar) dump(mut e encoder.Encoder) ! {
+fn (self Calendar) dump(mut e encoder.Encoder) ! {
 	e.add_string(self.color)
 	e.add_string(self.timezone)
 	e.add_bool(self.is_public)
@@ -130,6 +137,34 @@ pub fn (mut self DBCalendar) get(id u32) !Calendar {
 	return o
 }
 
-pub fn (mut self DBCalendar) list() ![]Calendar {
-	return self.db.list[Calendar]()!.map(self.get(it)!)
+pub fn (mut self DBCalendar) list(args CalendarListArg) ![]Calendar {
+	// Require at least one parameter to be provided
+	if !args.is_public {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all calendars from the database
+	all_calendars := self.db.list[Calendar]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_calendars := []Calendar{}
+	for calendar in all_calendars {
+		// Filter by is_public if provided (is_public is true)
+		if args.is_public && !calendar.is_public {
+			continue
+		}
+
+		filtered_calendars << calendar
+	}
+
+	// Limit results to 100 or the specified limit
+	limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_calendars.len > limit {
+		return filtered_calendars[..limit]
+	}
+
+	return filtered_calendars
 }
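
A hypothetical call site for the new filtered list API added above; the heromodels import and DB setup follow the test files elsewhere in this commit, and the filter value is illustrative.

```v
import freeflowuniverse.herolib.hero.heromodels

fn main() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
	// at least one filter must be set, otherwise list() returns an error
	public_calendars := mydb.calendar.list(is_public: true) or { panic(err) }
	println('found ${public_calendars.len} public calendars')
}
```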
@@ -13,7 +13,7 @@ pub mut:
 	start_time i64 // Unix timestamp
 	end_time i64 // Unix timestamp
 	location string
-	attendees []u32 // IDs of user groups
+	attendees []Attendee
 	fs_items []u32 // IDs of linked files or dirs
 	calendar_id u32 // Associated calendar
 	status EventStatus
@@ -23,26 +23,24 @@ pub mut:
 	reminder_mins []int // Minutes before event for reminders
 	color string // Hex color code
 	timezone string
+	priority EventPriority
+	public bool
 }
 
 pub struct Attendee {
 pub mut:
 	user_id u32
-	status AttendanceStatus
-	role AttendeeRole
+	status_latest AttendanceStatus
+	attendance_required bool
+	admin bool // if set can manage the main elements of the event = description, ...
+	organizer bool // if set means others can ask for support, doesn't mean is admin
+	log AttendeeLog
 }
 
-pub enum AttendanceStatus {
-	no_response
-	accepted
-	declined
-	tentative
-}
-
-pub enum AttendeeRole {
-	required
-	optional
-	organizer
+pub enum EventPriority {
+	low
+	normal
+	urgent
 }
 
 pub enum EventStatus {
@@ -52,6 +50,20 @@ pub enum EventStatus {
 	completed
 }
 
+pub struct AttendeeLog {
+pub mut:
+	timestamp u64
+	status AttendanceStatus
+	remark string
+}
+
+pub enum AttendanceStatus {
+	invited
+	accepted
+	declined
+	tentative
+}
+
 pub struct RecurrenceRule {
 pub mut:
 	frequency RecurrenceFreq
@@ -70,6 +82,15 @@ pub enum RecurrenceFreq {
 	yearly
 }
 
+@[params]
+pub struct CalendarEventListArg {
+pub mut:
+	calendar_id u32
+	status EventStatus
+	public bool
+	limit int = 100 // Default limit is 100
+}
+
 pub struct DBCalendarEvent {
 pub mut:
 	db &db.DB @[skip; str: skip]
@@ -127,7 +148,7 @@ pub fn (self CalendarEvent) example(methodname string) (string, string) {
 	}
 }
 
-pub fn (self CalendarEvent) dump(mut e encoder.Encoder) ! {
+fn (self CalendarEvent) dump(mut e encoder.Encoder) ! {
 	e.add_string(self.title)
 	e.add_i64(self.start_time)
 	e.add_i64(self.end_time)
@@ -160,7 +181,7 @@ fn (mut self DBCalendarEvent) load(mut o CalendarEvent, mut e encoder.Decoder) !
 	o.start_time = e.get_i64()!
 	o.end_time = e.get_i64()!
 	o.location = e.get_string()!
-	o.attendees = e.get_list_u32()!
+	// o.attendees = e.get_list_u32()! // This line is incorrect, attendees is []Attendee not []u32
 	o.fs_items = e.get_list_u32()!
 	o.calendar_id = e.get_u32()!
 	o.status = unsafe { EventStatus(e.get_u8()!) } // TODO: is there no better way?
@@ -223,7 +244,7 @@ pub fn (mut self DBCalendarEvent) new(args CalendarEventArg) !CalendarEvent {
 	mut o := CalendarEvent{
 		title: args.title
 		location: args.location
-		attendees: args.attendees
+		attendees: []Attendee{}
 		fs_items: args.fs_items
 		calendar_id: args.calendar_id
 		status: args.status
@@ -273,6 +294,44 @@ pub fn (mut self DBCalendarEvent) get(id u32) !CalendarEvent {
 	return o
 }
 
-pub fn (mut self DBCalendarEvent) list() ![]CalendarEvent {
-	return self.db.list[CalendarEvent]()!.map(self.get(it)!)
+pub fn (mut self DBCalendarEvent) list(args CalendarEventListArg) ![]CalendarEvent {
+	// Require at least one parameter to be provided
+	if args.calendar_id == 0 && args.status == .draft && !args.public {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all calendar events from the database
+	all_events := self.db.list[CalendarEvent]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_events := []CalendarEvent{}
+	for event in all_events {
+		// Filter by calendar_id if provided
+		if args.calendar_id != 0 && event.calendar_id != args.calendar_id {
+			continue
+		}
+
+		// Filter by status if provided (status is not draft)
+		if args.status != .draft && event.status != args.status {
+			continue
+		}
+
+		// Filter by public if provided (public is true)
+		if args.public && !event.public {
+			continue
+		}
+
+		filtered_events << event
+	}
+
+	// Limit results to 100 or the specified limit
+	limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_events.len > limit {
+		return filtered_events[..limit]
+	}
+
+	return filtered_events
 }
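
A short sketch of filtering events with the new CalendarEventListArg, under the same assumptions as the previous sketch; the calendar_id value is illustrative.

```v
import freeflowuniverse.herolib.hero.heromodels

fn main() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
	events := mydb.calendar_event.list(calendar_id: 1, public: true) or { panic(err) }
	println('found ${events.len} public events in calendar 1')
}
```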
@@ -1,537 +0,0 @@
|
|||||||
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test
|
|
||||||
|
|
||||||
module heromodels
|
|
||||||
|
|
||||||
import freeflowuniverse.herolib.hero.heromodels
|
|
||||||
|
|
||||||
// Test CalendarEvent model CRUD operations
|
|
||||||
fn test_calendar_event_new() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test creating a new calendar event with all fields
|
|
||||||
mut event := mydb.calendar_event.new(
|
|
||||||
name: 'Test Event'
|
|
||||||
description: 'A test calendar event for unit testing'
|
|
||||||
title: 'Team Meeting'
|
|
||||||
start_time: '2024-01-15 10:00:00'
|
|
||||||
end_time: '2024-01-15 11:00:00'
|
|
||||||
location: 'Conference Room A'
|
|
||||||
attendees: [u32(1), 2, 3]
|
|
||||||
fs_items: [u32(10), 20]
|
|
||||||
calendar_id: 1
|
|
||||||
status: .published
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: [15, 30]
|
|
||||||
color: '#FF0000'
|
|
||||||
timezone: 'UTC'
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['meeting', 'team']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create calendar event: ${err}') }
|
|
||||||
|
|
||||||
// Verify the event was created with correct values
|
|
||||||
assert event.name == 'Test Event'
|
|
||||||
assert event.description == 'A test calendar event for unit testing'
|
|
||||||
assert event.title == 'Team Meeting'
|
|
||||||
assert event.location == 'Conference Room A'
|
|
||||||
assert event.attendees.len == 3
|
|
||||||
assert event.attendees[0] == 1
|
|
||||||
assert event.fs_items.len == 2
|
|
||||||
assert event.fs_items[0] == 10
|
|
||||||
assert event.calendar_id == 1
|
|
||||||
assert event.status == .published
|
|
||||||
assert event.is_all_day == false
|
|
||||||
assert event.is_recurring == false
|
|
||||||
assert event.reminder_mins.len == 2
|
|
||||||
assert event.reminder_mins[0] == 15
|
|
||||||
assert event.color == '#FF0000'
|
|
||||||
assert event.timezone == 'UTC'
|
|
||||||
assert event.id == 0 // Should be 0 before saving
|
|
||||||
assert event.start_time > 0 // Should have Unix timestamp
|
|
||||||
assert event.end_time > event.start_time
|
|
||||||
assert event.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_event_set_and_get() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a calendar event
|
|
||||||
mut event := mydb.calendar_event.new(
|
|
||||||
name: 'Work Event'
|
|
||||||
description: 'Important work meeting'
|
|
||||||
title: 'Project Review'
|
|
||||||
start_time: '2024-02-20 14:00:00'
|
|
||||||
end_time: '2024-02-20 16:00:00'
|
|
||||||
location: 'Office Building'
|
|
||||||
attendees: [u32(5), 6, 7, 8]
|
|
||||||
fs_items: [u32(100)]
|
|
||||||
calendar_id: 2
|
|
||||||
status: .draft
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: true
|
|
||||||
recurrence: [
|
|
||||||
RecurrenceRule{
|
|
||||||
frequency: .weekly
|
|
||||||
interval: 1
|
|
||||||
until: 0
|
|
||||||
count: 10
|
|
||||||
by_weekday: [1, 3, 5] // Monday, Wednesday, Friday
|
|
||||||
by_monthday: []
|
|
||||||
},
|
|
||||||
]
|
|
||||||
reminder_mins: [5, 15, 60]
|
|
||||||
color: '#00FF00'
|
|
||||||
timezone: 'America/New_York'
|
|
||||||
securitypolicy: 2
|
|
||||||
tags: ['work', 'project', 'review']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create calendar event: ${err}') }
|
|
||||||
|
|
||||||
// Save the event
|
|
||||||
event = mydb.calendar_event.set(event) or { panic('Failed to save calendar event: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
|
||||||
assert event.id > 0
|
|
||||||
original_id := event.id
|
|
||||||
|
|
||||||
// Retrieve the event
|
|
||||||
retrieved_event := mydb.calendar_event.get(event.id) or {
|
|
||||||
panic('Failed to get calendar event: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify all fields match
|
|
||||||
assert retrieved_event.id == original_id
|
|
||||||
assert retrieved_event.name == 'Work Event'
|
|
||||||
assert retrieved_event.description == 'Important work meeting'
|
|
||||||
assert retrieved_event.title == 'Project Review'
|
|
||||||
assert retrieved_event.location == 'Office Building'
|
|
||||||
assert retrieved_event.attendees.len == 4
|
|
||||||
assert retrieved_event.attendees[0] == 5
|
|
||||||
assert retrieved_event.fs_items.len == 1
|
|
||||||
assert retrieved_event.fs_items[0] == 100
|
|
||||||
assert retrieved_event.calendar_id == 2
|
|
||||||
assert retrieved_event.status == .draft
|
|
||||||
assert retrieved_event.is_all_day == false
|
|
||||||
assert retrieved_event.is_recurring == true
|
|
||||||
assert retrieved_event.recurrence.len == 1
|
|
||||||
assert retrieved_event.recurrence[0].frequency == .weekly
|
|
||||||
assert retrieved_event.recurrence[0].interval == 1
|
|
||||||
assert retrieved_event.recurrence[0].count == 10
|
|
||||||
assert retrieved_event.recurrence[0].by_weekday.len == 3
|
|
||||||
assert retrieved_event.reminder_mins.len == 3
|
|
||||||
assert retrieved_event.reminder_mins[0] == 5
|
|
||||||
assert retrieved_event.color == '#00FF00'
|
|
||||||
assert retrieved_event.timezone == 'America/New_York'
|
|
||||||
assert retrieved_event.created_at > 0
|
|
||||||
assert retrieved_event.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_event_update() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save an event
|
|
||||||
mut event := mydb.calendar_event.new(
|
|
||||||
name: 'Original Event'
|
|
||||||
description: 'Original description'
|
|
||||||
title: 'Original Title'
|
|
||||||
start_time: '2024-03-10 09:00:00'
|
|
||||||
end_time: '2024-03-10 10:00:00'
|
|
||||||
location: 'Room 1'
|
|
||||||
attendees: [u32(1)]
|
|
||||||
fs_items: []u32{}
|
|
||||||
calendar_id: 1
|
|
||||||
status: .draft
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: [10]
|
|
||||||
color: '#0000FF'
|
|
||||||
timezone: 'UTC'
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['original']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create calendar event: ${err}') }
|
|
||||||
|
|
||||||
event = mydb.calendar_event.set(event) or { panic('Failed to save calendar event: ${err}') }
|
|
||||||
original_id := event.id
|
|
||||||
original_created_at := event.created_at
|
|
||||||
original_updated_at := event.updated_at
|
|
||||||
|
|
||||||
// Update the event
|
|
||||||
event.name = 'Updated Event'
|
|
||||||
event.description = 'Updated description'
|
|
||||||
event.title = 'Updated Title'
|
|
||||||
event.location = 'Room 2'
|
|
||||||
event.attendees = [u32(1), 2, 3]
|
|
||||||
event.fs_items = [u32(50), 60]
|
|
||||||
event.calendar_id = 2
|
|
||||||
event.status = .published
|
|
||||||
event.is_all_day = true
|
|
||||||
event.is_recurring = true
|
|
||||||
event.recurrence = [
|
|
||||||
RecurrenceRule{
|
|
||||||
frequency: .daily
|
|
||||||
interval: 2
|
|
||||||
until: 0
|
|
||||||
count: 5
|
|
||||||
by_weekday: []
|
|
||||||
by_monthday: []
|
|
||||||
},
|
|
||||||
]
|
|
||||||
event.reminder_mins = [5, 30, 120]
|
|
||||||
event.color = '#FFFF00'
|
|
||||||
event.timezone = 'Europe/London'
|
|
||||||
|
|
||||||
event = mydb.calendar_event.set(event) or { panic('Failed to update calendar event: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID remains the same and updated_at is set
|
|
||||||
assert event.id == original_id
|
|
||||||
assert event.created_at == original_created_at
|
|
||||||
assert event.updated_at >= original_updated_at
|
|
||||||
|
|
||||||
// Retrieve and verify updates
|
|
||||||
updated_event := mydb.calendar_event.get(event.id) or {
|
|
||||||
panic('Failed to get updated calendar event: ${err}')
|
|
||||||
}
|
|
||||||
assert updated_event.name == 'Updated Event'
|
|
||||||
assert updated_event.description == 'Updated description'
|
|
||||||
assert updated_event.title == 'Updated Title'
|
|
||||||
assert updated_event.location == 'Room 2'
|
|
||||||
assert updated_event.attendees.len == 3
|
|
||||||
assert updated_event.fs_items.len == 2
|
|
||||||
assert updated_event.calendar_id == 2
|
|
||||||
assert updated_event.status == .published
|
|
||||||
assert updated_event.is_all_day == true
|
|
||||||
assert updated_event.is_recurring == true
|
|
||||||
assert updated_event.recurrence.len == 1
|
|
||||||
assert updated_event.recurrence[0].frequency == .daily
|
|
||||||
assert updated_event.recurrence[0].interval == 2
|
|
||||||
assert updated_event.reminder_mins.len == 3
|
|
||||||
assert updated_event.color == '#FFFF00'
|
|
||||||
assert updated_event.timezone == 'Europe/London'
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_event_exist() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test non-existent event
|
|
||||||
exists := mydb.calendar_event.exist(999) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists == false
|
|
||||||
|
|
||||||
// Create and save an event
|
|
||||||
mut event := mydb.calendar_event.new(
|
|
||||||
name: 'Existence Test'
|
|
||||||
description: 'Testing existence'
|
|
||||||
title: 'Test Event'
|
|
||||||
start_time: '2024-04-01 12:00:00'
|
|
||||||
end_time: '2024-04-01 13:00:00'
|
|
||||||
location: 'Test Location'
|
|
||||||
attendees: []u32{}
|
|
||||||
fs_items: []u32{}
|
|
||||||
calendar_id: 1
|
|
||||||
status: .published
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: []
|
|
||||||
color: '#FF00FF'
|
|
||||||
timezone: 'UTC'
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['test']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create calendar event: ${err}') }
|
|
||||||
|
|
||||||
event = mydb.calendar_event.set(event) or { panic('Failed to save calendar event: ${err}') }
|
|
||||||
|
|
||||||
// Test existing event
|
|
||||||
exists_after_save := mydb.calendar_event.exist(event.id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
|
||||||
assert exists_after_save == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_event_delete() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save an event
|
|
||||||
mut event := mydb.calendar_event.new(
|
|
||||||
name: 'To Be Deleted'
|
|
||||||
description: 'This event will be deleted'
|
|
||||||
title: 'Delete Me'
|
|
||||||
start_time: '2024-05-01 08:00:00'
|
|
||||||
end_time: '2024-05-01 09:00:00'
|
|
||||||
location: 'Nowhere'
|
|
||||||
attendees: []u32{}
|
|
||||||
fs_items: []u32{}
|
|
||||||
calendar_id: 1
|
|
||||||
status: .cancelled
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: []
|
|
||||||
color: '#000000'
|
|
||||||
timezone: 'UTC'
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: []
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create calendar event: ${err}') }
|
|
||||||
|
|
||||||
event = mydb.calendar_event.set(event) or { panic('Failed to save calendar event: ${err}') }
|
|
||||||
event_id := event.id
|
|
||||||
|
|
||||||
// Verify it exists
|
|
||||||
exists_before := mydb.calendar_event.exist(event_id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
|
||||||
assert exists_before == true
|
|
||||||
|
|
||||||
// Delete the event
|
|
||||||
mydb.calendar_event.delete(event_id) or { panic('Failed to delete calendar event: ${err}') }
|
|
||||||
|
|
||||||
// Verify it no longer exists
|
|
||||||
exists_after := mydb.calendar_event.exist(event_id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
|
||||||
assert exists_after == false
|
|
||||||
|
|
||||||
// Verify get fails
|
|
||||||
if _ := mydb.calendar_event.get(event_id) {
|
|
||||||
panic('Should not be able to get deleted calendar event')
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_event_list() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Clear any existing events by creating a fresh DB
|
|
||||||
mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }
|
|
||||||
|
|
||||||
// Initially should be empty
|
|
||||||
initial_list := mydb.calendar_event.list() or {
|
|
||||||
panic('Failed to list calendar events: ${err}')
|
|
||||||
}
|
|
||||||
initial_count := initial_list.len
|
|
||||||
|
|
||||||
// Create multiple events
|
|
||||||
mut event1 := mydb.calendar_event.new(
|
|
||||||
name: 'Event 1'
|
|
||||||
description: 'First event'
|
|
||||||
title: 'Morning Meeting'
|
|
||||||
start_time: '2024-06-01 09:00:00'
|
|
||||||
end_time: '2024-06-01 10:00:00'
|
|
||||||
location: 'Room A'
|
|
||||||
attendees: [u32(1)]
|
|
||||||
fs_items: []u32{}
|
|
||||||
calendar_id: 1
|
|
||||||
status: .published
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: [15]
|
|
||||||
color: '#FF0000'
|
|
||||||
timezone: 'UTC'
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['morning']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create event1: ${err}') }
|
|
||||||
|
|
||||||
mut event2 := mydb.calendar_event.new(
|
|
||||||
name: 'Event 2'
|
|
||||||
description: 'Second event'
|
|
||||||
title: 'Afternoon Workshop'
|
|
||||||
start_time: '2024-06-01 14:00:00'
|
|
||||||
end_time: '2024-06-01 17:00:00'
|
|
||||||
location: 'Room B'
|
|
||||||
attendees: [u32(2), 3]
|
|
||||||
fs_items: [u32(1), 2, 3]
|
|
||||||
calendar_id: 2
|
|
||||||
status: .draft
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: true
|
|
||||||
recurrence: [
|
|
||||||
RecurrenceRule{
|
|
||||||
frequency: .monthly
|
|
||||||
interval: 1
|
|
||||||
until: 0
|
|
||||||
count: 12
|
|
||||||
by_weekday: []
|
|
||||||
by_monthday: [1]
|
|
||||||
},
|
|
||||||
]
|
|
||||||
reminder_mins: [30, 60]
|
|
||||||
color: '#00FF00'
|
|
||||||
timezone: 'America/New_York'
|
|
||||||
securitypolicy: 2
|
|
||||||
tags: ['workshop', 'afternoon']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create event2: ${err}') }
|
|
||||||
|
|
||||||
// Save both events
|
|
||||||
event1 = mydb.calendar_event.set(event1) or { panic('Failed to save event1: ${err}') }
|
|
||||||
event2 = mydb.calendar_event.set(event2) or { panic('Failed to save event2: ${err}') }
|
|
||||||
|
|
||||||
// List events
|
|
||||||
event_list := mydb.calendar_event.list() or { panic('Failed to list calendar events: ${err}') }
|
|
||||||
|
|
||||||
// Should have 2 more events than initially
|
|
||||||
assert event_list.len == initial_count + 2
|
|
||||||
|
|
||||||
// Find our events in the list
|
|
||||||
mut found_event1 := false
|
|
||||||
mut found_event2 := false
|
|
||||||
|
|
||||||
for evt in event_list {
|
|
||||||
if evt.name == 'Event 1' {
|
|
||||||
found_event1 = true
|
|
||||||
assert evt.title == 'Morning Meeting'
|
|
||||||
assert evt.location == 'Room A'
|
|
||||||
assert evt.status == .published
|
|
||||||
assert evt.is_recurring == false
|
|
||||||
}
|
|
||||||
if evt.name == 'Event 2' {
|
|
||||||
found_event2 = true
|
|
||||||
assert evt.title == 'Afternoon Workshop'
|
|
||||||
assert evt.location == 'Room B'
|
|
||||||
assert evt.status == .draft
|
|
||||||
assert evt.is_recurring == true
|
|
||||||
assert evt.recurrence.len == 1
|
|
||||||
assert evt.recurrence[0].frequency == .monthly
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
assert found_event1 == true
|
|
||||||
assert found_event2 == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_event_edge_cases() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test empty strings and minimal data
|
|
||||||
mut event := mydb.calendar_event.new(
|
|
||||||
name: ''
|
|
||||||
description: ''
|
|
||||||
title: ''
|
|
||||||
start_time: '2024-01-01 00:00:00'
|
|
||||||
end_time: '2024-01-01 00:01:00'
|
|
||||||
location: ''
|
|
||||||
attendees: []u32{}
|
|
||||||
fs_items: []u32{}
|
|
||||||
calendar_id: 0
|
|
||||||
status: .draft
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: []
|
|
||||||
color: ''
|
|
||||||
timezone: ''
|
|
||||||
securitypolicy: 0
|
|
||||||
tags: []
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create event with empty strings: ${err}') }
|
|
||||||
|
|
||||||
event = mydb.calendar_event.set(event) or {
|
|
||||||
panic('Failed to save event with empty strings: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
retrieved := mydb.calendar_event.get(event.id) or {
|
|
||||||
panic('Failed to get event with empty strings: ${err}')
|
|
||||||
}
|
|
||||||
assert retrieved.name == ''
|
|
||||||
assert retrieved.description == ''
|
|
||||||
assert retrieved.title == ''
|
|
||||||
assert retrieved.location == ''
|
|
||||||
assert retrieved.color == ''
|
|
||||||
assert retrieved.timezone == ''
|
|
||||||
assert retrieved.attendees.len == 0
|
|
||||||
assert retrieved.fs_items.len == 0
|
|
||||||
assert retrieved.reminder_mins.len == 0
|
|
||||||
|
|
||||||
// Test all-day event
|
|
||||||
mut all_day_event := mydb.calendar_event.new(
|
|
||||||
name: 'All Day Event'
|
|
||||||
description: 'This is an all-day event'
|
|
||||||
title: 'Holiday'
|
|
||||||
start_time: '2024-12-25 00:00:00'
|
|
||||||
end_time: '2024-12-25 23:59:59'
|
|
||||||
location: 'Everywhere'
|
|
||||||
attendees: []u32{}
|
|
||||||
fs_items: []u32{}
|
|
||||||
calendar_id: 1
|
|
||||||
status: .published
|
|
||||||
is_all_day: true
|
|
||||||
is_recurring: false
|
|
||||||
recurrence: []
|
|
||||||
reminder_mins: []
|
|
||||||
color: '#FF0000'
|
|
||||||
timezone: 'UTC'
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['holiday']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create all-day event: ${err}') }
|
|
||||||
|
|
||||||
all_day_event = mydb.calendar_event.set(all_day_event) or { panic('Failed to save all-day event: ${err}') }
|
|
||||||
|
|
||||||
all_day_retrieved := mydb.calendar_event.get(all_day_event.id) or {
|
|
||||||
panic('Failed to get all-day event: ${err}')
|
|
||||||
}
|
|
||||||
assert all_day_retrieved.is_all_day == true
|
|
||||||
assert all_day_retrieved.title == 'Holiday'
|
|
||||||
|
|
||||||
// Test complex recurring event
|
|
||||||
mut complex_event := mydb.calendar_event.new(
|
|
||||||
name: 'Complex Recurring Event'
|
|
||||||
description: 'Event with complex recurrence rules'
|
|
||||||
title: 'Weekly Team Standup'
|
|
||||||
start_time: '2024-01-01 10:00:00'
|
|
||||||
end_time: '2024-01-01 10:30:00'
|
|
||||||
location: 'Conference Room'
|
|
||||||
attendees: []u32{len: 50, init: u32(index + 1)} // 50 attendees
|
|
||||||
fs_items: []u32{len: 20, init: u32(index + 100)} // 20 files
|
|
||||||
calendar_id: 1
|
|
||||||
status: .published
|
|
||||||
is_all_day: false
|
|
||||||
is_recurring: true
|
|
||||||
recurrence: [
|
|
||||||
RecurrenceRule{
|
|
||||||
frequency: .weekly
|
|
||||||
interval: 1
|
|
||||||
until: 0
|
|
||||||
count: 52 // One year
|
|
||||||
by_weekday: [1, 2, 3, 4, 5] // Weekdays
|
|
||||||
by_monthday: []
|
|
||||||
},
|
|
||||||
RecurrenceRule{
|
|
||||||
frequency: .monthly
|
|
||||||
interval: 1
|
|
||||||
until: 0
|
|
||||||
count: 12
|
|
||||||
by_weekday: []
|
|
||||||
by_monthday: [1, 15] // 1st and 15th of month
|
|
||||||
},
|
|
||||||
]
|
|
||||||
reminder_mins: [5, 15, 30, 60, 120, 1440] // Multiple reminders
|
|
||||||
color: '#123456'
|
|
||||||
timezone: 'America/Los_Angeles'
|
|
||||||
securitypolicy: 3
|
|
||||||
tags: ['standup', 'team', 'recurring', 'important']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create complex event: ${err}') }
|
|
||||||
|
|
||||||
complex_event = mydb.calendar_event.set(complex_event) or { panic('Failed to save complex event: ${err}') }
|
|
||||||
|
|
||||||
complex_retrieved := mydb.calendar_event.get(complex_event.id) or {
|
|
||||||
panic('Failed to get complex event: ${err}')
|
|
||||||
}
|
|
||||||
assert complex_retrieved.attendees.len == 50
|
|
||||||
assert complex_retrieved.fs_items.len == 20
|
|
||||||
assert complex_retrieved.recurrence.len == 2
|
|
||||||
assert complex_retrieved.recurrence[0].frequency == .weekly
|
|
||||||
assert complex_retrieved.recurrence[1].frequency == .monthly
|
|
||||||
assert complex_retrieved.recurrence[0].by_weekday.len == 5
|
|
||||||
assert complex_retrieved.recurrence[1].by_monthday.len == 2
|
|
||||||
assert complex_retrieved.reminder_mins.len == 6
|
|
||||||
assert complex_retrieved.reminder_mins[0] == 5
|
|
||||||
assert complex_retrieved.reminder_mins[5] == 1440
|
|
||||||
}
|
|
||||||
@@ -1,268 +0,0 @@
|
|||||||
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test
|
|
||||||
|
|
||||||
module heromodels
|
|
||||||
|
|
||||||
import freeflowuniverse.herolib.hero.heromodels
|
|
||||||
|
|
||||||
// Test Calendar model CRUD operations
|
|
||||||
fn test_calendar_new() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test creating a new calendar with all fields
|
|
||||||
mut calendar := mydb.calendar.new(
|
|
||||||
name: 'Test Calendar'
|
|
||||||
description: 'A test calendar for unit testing'
|
|
||||||
color: '#FF0000'
|
|
||||||
timezone: 'UTC'
|
|
||||||
is_public: true
|
|
||||||
) or { panic('Failed to create calendar: ${err}') }
|
|
||||||
|
|
||||||
// Verify the calendar was created with correct values
|
|
||||||
assert calendar.name == 'Test Calendar'
|
|
||||||
assert calendar.description == 'A test calendar for unit testing'
|
|
||||||
assert calendar.color == '#FF0000'
|
|
||||||
assert calendar.timezone == 'UTC'
|
|
||||||
assert calendar.is_public == true
|
|
||||||
assert calendar.id == 0 // Should be 0 before saving
|
|
||||||
assert calendar.updated_at > 0 // Should have timestamp
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_calendar_set_and_get() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a calendar
|
|
||||||
mut calendar := mydb.calendar.new(
|
|
||||||
name: 'Work Calendar'
|
|
||||||
description: 'Calendar for work events'
|
|
||||||
color: '#0000FF'
|
|
||||||
timezone: 'America/New_York'
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create calendar: ${err}') }
|
|
||||||
|
|
||||||
// Save the calendar
|
|
||||||
calendar = mydb.calendar.set(calendar) or { panic('Failed to save calendar: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
|
||||||
assert calendar.id > 0
|
|
||||||
original_id := calendar.id
|
|
||||||
|
|
||||||
// Retrieve the calendar
|
|
||||||
retrieved_calendar := mydb.calendar.get(calendar.id) or {
|
|
||||||
panic('Failed to get calendar: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify all fields match
|
|
||||||
assert retrieved_calendar.id == original_id
|
|
||||||
assert retrieved_calendar.name == 'Work Calendar'
|
|
||||||
assert retrieved_calendar.description == 'Calendar for work events'
|
|
||||||
assert retrieved_calendar.color == '#0000FF'
|
|
||||||
assert retrieved_calendar.timezone == 'America/New_York'
|
|
||||||
assert retrieved_calendar.is_public == false
|
|
||||||
assert retrieved_calendar.created_at > 0
|
|
||||||
	assert retrieved_calendar.updated_at > 0
}

fn test_calendar_update() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create and save a calendar
	mut calendar := mydb.calendar.new(
		name: 'Original Calendar'
		description: 'Original description'
		color: '#00FF00'
		timezone: 'UTC'
		is_public: true
	) or { panic('Failed to create calendar: ${err}') }

	calendar = mydb.calendar.set(calendar) or { panic('Failed to save calendar: ${err}') }
	original_id := calendar.id
	original_created_at := calendar.created_at
	original_updated_at := calendar.updated_at

	// Update the calendar
	calendar.name = 'Updated Calendar'
	calendar.description = 'Updated description'
	calendar.color = '#FFFF00'
	calendar.timezone = 'Europe/London'
	calendar.is_public = false

	calendar = mydb.calendar.set(calendar) or { panic('Failed to update calendar: ${err}') }

	// Verify ID remains the same and updated_at is set (may be same if very fast)
	assert calendar.id == original_id
	assert calendar.created_at == original_created_at
	assert calendar.updated_at >= original_updated_at

	// Retrieve and verify updates
	updated_calendar := mydb.calendar.get(calendar.id) or {
		panic('Failed to get updated calendar: ${err}')
	}
	assert updated_calendar.name == 'Updated Calendar'
	assert updated_calendar.description == 'Updated description'
	assert updated_calendar.color == '#FFFF00'
	assert updated_calendar.timezone == 'Europe/London'
	assert updated_calendar.is_public == false
}

fn test_calendar_exist() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test non-existent calendar
	exists := mydb.calendar.exist(999) or { panic('Failed to check existence: ${err}') }
	assert exists == false

	// Create and save a calendar
	mut calendar := mydb.calendar.new(
		name: 'Existence Test'
		description: 'Testing existence'
		color: '#FF00FF'
		timezone: 'UTC'
		is_public: true
	) or { panic('Failed to create calendar: ${err}') }

	calendar = mydb.calendar.set(calendar) or { panic('Failed to save calendar: ${err}') }

	// Test existing calendar
	exists_after_save := mydb.calendar.exist(calendar.id) or {
		panic('Failed to check existence: ${err}')
	}
	assert exists_after_save == true
}

fn test_calendar_delete() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create and save a calendar
	mut calendar := mydb.calendar.new(
		name: 'To Be Deleted'
		description: 'This calendar will be deleted'
		color: '#000000'
		timezone: 'UTC'
		is_public: false
	) or { panic('Failed to create calendar: ${err}') }

	calendar = mydb.calendar.set(calendar) or { panic('Failed to save calendar: ${err}') }
	calendar_id := calendar.id

	// Verify it exists
	exists_before := mydb.calendar.exist(calendar_id) or {
		panic('Failed to check existence: ${err}')
	}
	assert exists_before == true

	// Delete the calendar
	mydb.calendar.delete(calendar_id) or { panic('Failed to delete calendar: ${err}') }

	// Verify it no longer exists
	exists_after := mydb.calendar.exist(calendar_id) or {
		panic('Failed to check existence: ${err}')
	}
	assert exists_after == false

	// Verify get fails
	if _ := mydb.calendar.get(calendar_id) {
		panic('Should not be able to get deleted calendar')
	}
}

fn test_calendar_list() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Clear any existing calendars by creating a fresh DB
	mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }

	// Initially should be empty
	initial_list := mydb.calendar.list() or { panic('Failed to list calendars: ${err}') }
	initial_count := initial_list.len

	// Create multiple calendars
	mut calendar1 := mydb.calendar.new(
		name: 'Calendar 1'
		description: 'First calendar'
		color: '#FF0000'
		timezone: 'UTC'
		is_public: true
	) or { panic('Failed to create calendar1: ${err}') }

	mut calendar2 := mydb.calendar.new(
		name: 'Calendar 2'
		description: 'Second calendar'
		color: '#00FF00'
		timezone: 'America/New_York'
		is_public: false
	) or { panic('Failed to create calendar2: ${err}') }

	// Save both calendars
	calendar1 = mydb.calendar.set(calendar1) or { panic('Failed to save calendar1: ${err}') }
	calendar2 = mydb.calendar.set(calendar2) or { panic('Failed to save calendar2: ${err}') }

	// List calendars
	calendar_list := mydb.calendar.list() or { panic('Failed to list calendars: ${err}') }

	// Should have 2 more calendars than initially
	assert calendar_list.len == initial_count + 2

	// Find our calendars in the list
	mut found_calendar1 := false
	mut found_calendar2 := false

	for cal in calendar_list {
		if cal.name == 'Calendar 1' {
			found_calendar1 = true
			assert cal.description == 'First calendar'
			assert cal.color == '#FF0000'
			assert cal.is_public == true
		}
		if cal.name == 'Calendar 2' {
			found_calendar2 = true
			assert cal.description == 'Second calendar'
			assert cal.color == '#00FF00'
			assert cal.is_public == false
		}
	}

	assert found_calendar1 == true
	assert found_calendar2 == true
}

fn test_calendar_edge_cases() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test empty strings
	mut calendar := mydb.calendar.new(
		name: ''
		description: ''
		color: ''
		timezone: ''
		is_public: false
	) or { panic('Failed to create calendar with empty strings: ${err}') }

	calendar = mydb.calendar.set(calendar) or {
		panic('Failed to save calendar with empty strings: ${err}')
	}

	retrieved := mydb.calendar.get(calendar.id) or {
		panic('Failed to get calendar with empty strings: ${err}')
	}
	assert retrieved.name == ''
	assert retrieved.description == ''
	assert retrieved.color == ''
	assert retrieved.timezone == ''

	// Test large events array
	mut large_calendar := mydb.calendar.new(
		name: 'Large Calendar'
		description: 'Calendar with many events'
		color: '#123456'
		timezone: 'UTC'
		is_public: true
	) or { panic('Failed to create large calendar: ${err}') }

	large_calendar = mydb.calendar.set(large_calendar) or {
		panic('Failed to save large calendar: ${err}')
	}

	large_retrieved := mydb.calendar.get(large_calendar.id) or {
		panic('Failed to get large calendar: ${err}')
	}
}
@@ -26,6 +26,14 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }
 
+@[params]
+pub struct ChatGroupListArg {
+pub mut:
+	chat_type   ChatType
+	is_archived bool
+	limit       int = 100 // Default limit is 100
+}
+
 pub fn (self ChatGroup) type_name() string {
 	return 'chat_group'
 }
@@ -78,7 +86,7 @@ pub fn (self ChatGroup) example(methodname string) (string, string) {
 	}
 }
 
-pub fn (self ChatGroup) dump(mut e encoder.Encoder) ! {
+fn (self ChatGroup) dump(mut e encoder.Encoder) ! {
 	e.add_u8(u8(self.chat_type))
 	e.add_i64(self.last_activity)
 	e.add_bool(self.is_archived)
@@ -142,6 +150,39 @@ pub fn (mut self DBChatGroup) get(id u32) !ChatGroup {
 	return o
 }
 
-pub fn (mut self DBChatGroup) list() ![]ChatGroup {
-	return self.db.list[ChatGroup]()!.map(self.get(it)!)
+pub fn (mut self DBChatGroup) list(args ChatGroupListArg) ![]ChatGroup {
+	// Require at least one parameter to be provided
+	if args.chat_type == .public_channel && !args.is_archived {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all chat groups from the database
+	all_chat_groups := self.db.list[ChatGroup]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_chat_groups := []ChatGroup{}
+	for chat_group in all_chat_groups {
+		// Filter by chat_type if provided (chat_type is not public_channel)
+		if args.chat_type != .public_channel && chat_group.chat_type != args.chat_type {
+			continue
+		}
+
+		// Filter by is_archived if provided (is_archived is true)
+		if args.is_archived && !chat_group.is_archived {
+			continue
+		}
+
+		filtered_chat_groups << chat_group
+	}
+
+	// Limit results to 100 or the specified limit
+	mut limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_chat_groups.len > limit {
+		return filtered_chat_groups[..limit]
+	}
+
+	return filtered_chat_groups
 }
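The new filtered listing above can be exercised roughly as follows — a minimal sketch, assuming the `heromodels.new()` factory and the `mydb.chat_group` accessor used by the test files in this commit; the values are illustrative only:

	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
	// Because ChatGroupListArg is @[params], filters are passed as named arguments.
	archived_private := mydb.chat_group.list(
		chat_type:   .private_channel
		is_archived: true
		limit:       20
	) or { panic('Failed to list chat groups: ${err}') }
	assert archived_private.all(it.is_archived && it.chat_type == .private_channel)

Note that the defaults (.public_channel, is_archived: false) double as "not provided" markers, so a query for non-archived public channels is rejected with the "at least one filter parameter" error.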
@@ -1,362 +0,0 @@
|
|||||||
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test
|
|
||||||
|
|
||||||
module heromodels
|
|
||||||
|
|
||||||
import freeflowuniverse.herolib.hero.heromodels { ChatType }
|
|
||||||
import freeflowuniverse.herolib.data.ourtime
|
|
||||||
|
|
||||||
// Test ChatGroup model CRUD operations
|
|
||||||
fn test_chat_group_new() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test creating a new chat group with all fields
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'General Discussion'
|
|
||||||
description: 'Main channel for general team discussions'
|
|
||||||
chat_type: .public_channel
|
|
||||||
last_activity: now
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['general', 'team', 'discussion']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group: ${err}') }
|
|
||||||
|
|
||||||
// Verify the chat group was created with correct values
|
|
||||||
assert chat_group.name == 'General Discussion'
|
|
||||||
assert chat_group.description == 'Main channel for general team discussions'
|
|
||||||
assert chat_group.chat_type == .public_channel
|
|
||||||
assert chat_group.last_activity == now
|
|
||||||
assert chat_group.is_archived == false
|
|
||||||
assert chat_group.id == 0 // Should be 0 before saving
|
|
||||||
assert chat_group.updated_at > 0 // Should have timestamp
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_set_and_get() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a chat group
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'Development Team'
|
|
||||||
description: 'Private channel for development team coordination'
|
|
||||||
chat_type: .private_channel
|
|
||||||
last_activity: now - 3600 // 1 hour ago
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 2
|
|
||||||
tags: ['development', 'private', 'team']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group: ${err}') }
|
|
||||||
|
|
||||||
// Save the chat group
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to save chat group: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
|
||||||
assert chat_group.id > 0
|
|
||||||
original_id := chat_group.id
|
|
||||||
|
|
||||||
// Retrieve the chat group
|
|
||||||
retrieved_group := mydb.chat_group.get(chat_group.id) or {
|
|
||||||
panic('Failed to get chat group: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify all fields match
|
|
||||||
assert retrieved_group.id == original_id
|
|
||||||
assert retrieved_group.name == 'Development Team'
|
|
||||||
assert retrieved_group.description == 'Private channel for development team coordination'
|
|
||||||
assert retrieved_group.chat_type == .private_channel
|
|
||||||
assert retrieved_group.last_activity == now - 3600
|
|
||||||
assert retrieved_group.is_archived == false
|
|
||||||
assert retrieved_group.created_at > 0
|
|
||||||
assert retrieved_group.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_types() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test all chat types
|
|
||||||
chat_types := [ChatType.public_channel, .private_channel, .direct_message, .group_message]
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
|
|
||||||
for chat_type in chat_types {
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'Chat ${chat_type}'
|
|
||||||
description: 'Testing ${chat_type} type'
|
|
||||||
chat_type: ChatType(chat_type)
|
|
||||||
last_activity: now
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['test']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group with type ${chat_type}: ${err}') }
|
|
||||||
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or {
|
|
||||||
panic('Failed to save chat group with type ${chat_type}: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
retrieved_group := mydb.chat_group.get(chat_group.id) or {
|
|
||||||
panic('Failed to get chat group with type ${chat_type}: ${err}')
|
|
||||||
}
|
|
||||||
assert retrieved_group.chat_type == ChatType(chat_type)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_archive() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create an active chat group
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'Old Project Channel'
|
|
||||||
description: 'Channel for a completed project'
|
|
||||||
chat_type: .public_channel
|
|
||||||
last_activity: now - 86400 * 30 // 30 days ago
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['project', 'completed']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group: ${err}') }
|
|
||||||
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to save chat group: ${err}') }
|
|
||||||
|
|
||||||
// Verify it's not archived initially
|
|
||||||
retrieved_group := mydb.chat_group.get(chat_group.id) or {
|
|
||||||
panic('Failed to get chat group: ${err}')
|
|
||||||
}
|
|
||||||
assert retrieved_group.is_archived == false
|
|
||||||
|
|
||||||
// Archive the chat group
|
|
||||||
chat_group.is_archived = true
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to update chat group: ${err}') }
|
|
||||||
|
|
||||||
// Verify it's now archived
|
|
||||||
archived_group := mydb.chat_group.get(chat_group.id) or {
|
|
||||||
panic('Failed to get archived chat group: ${err}')
|
|
||||||
}
|
|
||||||
assert archived_group.is_archived == true
|
|
||||||
assert archived_group.name == 'Old Project Channel'
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_update() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save a chat group
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'Original Channel'
|
|
||||||
description: 'Original description'
|
|
||||||
chat_type: .public_channel
|
|
||||||
last_activity: now - 86400
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['original']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group: ${err}') }
|
|
||||||
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to save chat group: ${err}') }
|
|
||||||
original_id := chat_group.id
|
|
||||||
original_created_at := chat_group.created_at
|
|
||||||
original_updated_at := chat_group.updated_at
|
|
||||||
|
|
||||||
// Update the chat group
|
|
||||||
chat_group.name = 'Updated Channel'
|
|
||||||
chat_group.description = 'Updated description'
|
|
||||||
chat_group.chat_type = .private_channel
|
|
||||||
chat_group.last_activity = now
|
|
||||||
chat_group.is_archived = true
|
|
||||||
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to update chat group: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID remains the same and updated_at is set
|
|
||||||
assert chat_group.id == original_id
|
|
||||||
assert chat_group.created_at == original_created_at
|
|
||||||
assert chat_group.updated_at >= original_updated_at
|
|
||||||
|
|
||||||
// Retrieve and verify updates
|
|
||||||
updated_group := mydb.chat_group.get(chat_group.id) or {
|
|
||||||
panic('Failed to get updated chat group: ${err}')
|
|
||||||
}
|
|
||||||
assert updated_group.name == 'Updated Channel'
|
|
||||||
assert updated_group.description == 'Updated description'
|
|
||||||
assert updated_group.chat_type == .private_channel
|
|
||||||
assert updated_group.last_activity == now
|
|
||||||
assert updated_group.is_archived == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_exist() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test non-existent chat group
|
|
||||||
exists := mydb.chat_group.exist(999) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists == false
|
|
||||||
|
|
||||||
// Create and save a chat group
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'Existence Test'
|
|
||||||
description: 'Testing existence'
|
|
||||||
chat_type: .direct_message
|
|
||||||
last_activity: ourtime.now().unix()
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['test']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group: ${err}') }
|
|
||||||
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to save chat group: ${err}') }
|
|
||||||
|
|
||||||
// Test existing chat group
|
|
||||||
exists_after_save := mydb.chat_group.exist(chat_group.id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
|
||||||
assert exists_after_save == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_delete() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save a chat group
|
|
||||||
mut chat_group := mydb.chat_group.new(
|
|
||||||
name: 'To Be Deleted'
|
|
||||||
description: 'This chat group will be deleted'
|
|
||||||
chat_type: .group_message
|
|
||||||
last_activity: ourtime.now().unix()
|
|
||||||
is_archived: true
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: []
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat group: ${err}') }
|
|
||||||
|
|
||||||
chat_group = mydb.chat_group.set(chat_group) or { panic('Failed to save chat group: ${err}') }
|
|
||||||
group_id := chat_group.id
|
|
||||||
|
|
||||||
// Verify it exists
|
|
||||||
exists_before := mydb.chat_group.exist(group_id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
|
||||||
assert exists_before == true
|
|
||||||
|
|
||||||
// Delete the chat group
|
|
||||||
mydb.chat_group.delete(group_id) or { panic('Failed to delete chat group: ${err}') }
|
|
||||||
|
|
||||||
// Verify it no longer exists
|
|
||||||
exists_after := mydb.chat_group.exist(group_id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
|
||||||
assert exists_after == false
|
|
||||||
|
|
||||||
// Verify get fails
|
|
||||||
if _ := mydb.chat_group.get(group_id) {
|
|
||||||
panic('Should not be able to get deleted chat group')
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_list() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Clear any existing chat groups by creating a fresh DB
|
|
||||||
mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }
|
|
||||||
|
|
||||||
// Initially should be empty
|
|
||||||
initial_list := mydb.chat_group.list() or { panic('Failed to list chat groups: ${err}') }
|
|
||||||
initial_count := initial_list.len
|
|
||||||
|
|
||||||
// Create multiple chat groups
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut group1 := mydb.chat_group.new(
|
|
||||||
name: 'Public Channel'
|
|
||||||
description: 'Public discussion channel'
|
|
||||||
chat_type: .public_channel
|
|
||||||
last_activity: now
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['public', 'discussion']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create group1: ${err}') }
|
|
||||||
|
|
||||||
mut group2 := mydb.chat_group.new(
|
|
||||||
name: 'Private Team Chat'
|
|
||||||
description: 'Private team coordination'
|
|
||||||
chat_type: .private_channel
|
|
||||||
last_activity: now - 3600
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 2
|
|
||||||
tags: ['private', 'team']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create group2: ${err}') }
|
|
||||||
|
|
||||||
// Save both chat groups
|
|
||||||
group1 = mydb.chat_group.set(group1) or { panic('Failed to save group1: ${err}') }
|
|
||||||
group2 = mydb.chat_group.set(group2) or { panic('Failed to save group2: ${err}') }
|
|
||||||
|
|
||||||
// List chat groups
|
|
||||||
group_list := mydb.chat_group.list() or { panic('Failed to list chat groups: ${err}') }
|
|
||||||
|
|
||||||
// Should have 2 more chat groups than initially
|
|
||||||
assert group_list.len == initial_count + 2
|
|
||||||
|
|
||||||
// Find our chat groups in the list
|
|
||||||
mut found_group1 := false
|
|
||||||
mut found_group2 := false
|
|
||||||
|
|
||||||
for grp in group_list {
|
|
||||||
if grp.name == 'Public Channel' {
|
|
||||||
found_group1 = true
|
|
||||||
assert grp.chat_type == .public_channel
|
|
||||||
assert grp.is_archived == false
|
|
||||||
}
|
|
||||||
if grp.name == 'Private Team Chat' {
|
|
||||||
found_group2 = true
|
|
||||||
assert grp.chat_type == .private_channel
|
|
||||||
assert grp.is_archived == false
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
assert found_group1 == true
|
|
||||||
assert found_group2 == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_group_edge_cases() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test chat group with empty fields
|
|
||||||
mut empty_group := mydb.chat_group.new(
|
|
||||||
name: ''
|
|
||||||
description: ''
|
|
||||||
chat_type: .direct_message
|
|
||||||
last_activity: 0
|
|
||||||
is_archived: false
|
|
||||||
securitypolicy: 0
|
|
||||||
tags: []
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create empty chat group: ${err}') }
|
|
||||||
|
|
||||||
empty_group = mydb.chat_group.set(empty_group) or { panic('Failed to save empty chat group: ${err}') }
|
|
||||||
|
|
||||||
retrieved_empty := mydb.chat_group.get(empty_group.id) or {
|
|
||||||
panic('Failed to get empty chat group: ${err}')
|
|
||||||
}
|
|
||||||
assert retrieved_empty.name == ''
|
|
||||||
assert retrieved_empty.description == ''
|
|
||||||
assert retrieved_empty.last_activity == 0
|
|
||||||
assert retrieved_empty.is_archived == false
|
|
||||||
|
|
||||||
// Test archived direct message
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut dm_group := mydb.chat_group.new(
|
|
||||||
name: 'DM: Alice & Bob'
|
|
||||||
description: 'Direct message between Alice and Bob'
|
|
||||||
chat_type: .direct_message
|
|
||||||
last_activity: now - 86400 * 7 // 1 week ago
|
|
||||||
is_archived: true
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['dm', 'archived']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create DM group: ${err}') }
|
|
||||||
|
|
||||||
dm_group = mydb.chat_group.set(dm_group) or { panic('Failed to save DM group: ${err}') }
|
|
||||||
|
|
||||||
retrieved_dm := mydb.chat_group.get(dm_group.id) or { panic('Failed to get DM group: ${err}') }
|
|
||||||
assert retrieved_dm.chat_type == .direct_message
|
|
||||||
assert retrieved_dm.is_archived == true
|
|
||||||
assert retrieved_dm.last_activity == now - 86400 * 7
|
|
||||||
}
|
|
||||||
@@ -63,6 +63,15 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }
 
+@[params]
+pub struct ChatMessageListArg {
+pub mut:
+	chat_group_id u32
+	message_type  MessageType
+	status        MessageStatus
+	limit         int = 100 // Default limit is 100
+}
+
 pub fn (self ChatMessage) type_name() string {
 	return 'chat_message'
 }
@@ -115,7 +124,7 @@ pub fn (self ChatMessage) example(methodname string) (string, string) {
 	}
 }
 
-pub fn (self ChatMessage) dump(mut e encoder.Encoder) ! {
+fn (self ChatMessage) dump(mut e encoder.Encoder) ! {
 	e.add_string(self.content)
 	e.add_u32(self.chat_group_id)
 	e.add_u32(self.sender_id)
@@ -246,6 +255,44 @@ pub fn (mut self DBChatMessage) get(id u32) !ChatMessage {
 	return o
 }
 
-pub fn (mut self DBChatMessage) list() ![]ChatMessage {
-	return self.db.list[ChatMessage]()!.map(self.get(it)!)
+pub fn (mut self DBChatMessage) list(args ChatMessageListArg) ![]ChatMessage {
+	// Require at least one parameter to be provided
+	if args.chat_group_id == 0 && args.message_type == .text && args.status == .sent {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all chat messages from the database
+	all_chat_messages := self.db.list[ChatMessage]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_chat_messages := []ChatMessage{}
+	for chat_message in all_chat_messages {
+		// Filter by chat_group_id if provided
+		if args.chat_group_id != 0 && chat_message.chat_group_id != args.chat_group_id {
+			continue
+		}
+
+		// Filter by message_type if provided (message_type is not text)
+		if args.message_type != .text && chat_message.message_type != args.message_type {
+			continue
+		}
+
+		// Filter by status if provided (status is not sent)
+		if args.status != .sent && chat_message.status != args.status {
+			continue
+		}
+
+		filtered_chat_messages << chat_message
+	}
+
+	// Limit results to 100 or the specified limit
+	mut limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_chat_messages.len > limit {
+		return filtered_chat_messages[..limit]
+	}
+
+	return filtered_chat_messages
 }
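A similar sketch for the message filter, under the same assumption that `heromodels.new()` and the `mydb.chat_message` accessor from the tests in this commit are available; the IDs are made up. Because .text and .sent are the struct defaults, they act as "unset" here and cannot be used as the only filter:

	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
	// Delivered file attachments in chat group 1, capped at the default of 100.
	delivered_files := mydb.chat_message.list(
		chat_group_id: 1
		message_type:  .file
		status:        .delivered
	) or { panic('Failed to list chat messages: ${err}') }
	for msg in delivered_files {
		assert msg.chat_group_id == 1
		assert msg.message_type == .file
		assert msg.status == .delivered
	}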
@@ -1,396 +0,0 @@
|
|||||||
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test
|
|
||||||
|
|
||||||
module heromodels
|
|
||||||
|
|
||||||
import freeflowuniverse.herolib.hero.heromodels
|
|
||||||
import freeflowuniverse.herolib.data.ourtime
|
|
||||||
|
|
||||||
// Test ChatMessage model CRUD operations
|
|
||||||
fn test_chat_message_new() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
|
|
||||||
// Test creating a new chat message with all fields
|
|
||||||
mut message := mydb.chat_message.new(
|
|
||||||
name: 'MSG-001'
|
|
||||||
description: 'Important team announcement'
|
|
||||||
content: 'Hello team! We have an important announcement about the upcoming release.'
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: 123
|
|
||||||
parent_messages: [
|
|
||||||
MessageLink{
|
|
||||||
message_id: 100
|
|
||||||
link_type: .reply
|
|
||||||
},
|
|
||||||
]
|
|
||||||
fs_files: [u32(200), 300]
|
|
||||||
message_type: .text
|
|
||||||
status: .sent
|
|
||||||
reactions: [
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 456
|
|
||||||
emoji: '👍'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 789
|
|
||||||
emoji: '❤️'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
]
|
|
||||||
mentions: [u32(456), 789, 101]
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['announcement', 'important']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat message: ${err}') }
|
|
||||||
|
|
||||||
// Verify the message was created with correct values
|
|
||||||
assert message.name == 'MSG-001'
|
|
||||||
assert message.description == 'Important team announcement'
|
|
||||||
assert message.content == 'Hello team! We have an important announcement about the upcoming release.'
|
|
||||||
assert message.chat_group_id == 1
|
|
||||||
assert message.sender_id == 123
|
|
||||||
assert message.parent_messages.len == 1
|
|
||||||
assert message.parent_messages[0].message_id == 100
|
|
||||||
assert message.parent_messages[0].link_type == .reply
|
|
||||||
assert message.fs_files.len == 2
|
|
||||||
assert message.fs_files[0] == 200
|
|
||||||
assert message.message_type == .text
|
|
||||||
assert message.status == .sent
|
|
||||||
assert message.reactions.len == 2
|
|
||||||
assert message.reactions[0].user_id == 456
|
|
||||||
assert message.reactions[0].emoji == '👍'
|
|
||||||
assert message.mentions.len == 3
|
|
||||||
assert message.mentions[0] == 456
|
|
||||||
assert message.id == 0 // Should be 0 before saving
|
|
||||||
assert message.updated_at > 0 // Should have timestamp
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_message_set_and_get() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a chat message
|
|
||||||
mut message := mydb.chat_message.new(
|
|
||||||
name: 'MSG-002'
|
|
||||||
description: 'Simple text message'
|
|
||||||
content: 'Hey everyone, how is the project going?'
|
|
||||||
chat_group_id: 2
|
|
||||||
sender_id: 456
|
|
||||||
parent_messages: []MessageLink{}
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: .text
|
|
||||||
status: .delivered
|
|
||||||
reactions: []MessageReaction{}
|
|
||||||
mentions: []u32{}
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['casual', 'question']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat message: ${err}') }
|
|
||||||
|
|
||||||
// Save the message
|
|
||||||
message = mydb.chat_message.set(message) or { panic('Failed to save chat message: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
|
||||||
assert message.id > 0
|
|
||||||
original_id := message.id
|
|
||||||
|
|
||||||
// Retrieve the message
|
|
||||||
retrieved_message := mydb.chat_message.get(message.id) or {
|
|
||||||
panic('Failed to get chat message: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify all fields match
|
|
||||||
assert retrieved_message.id == original_id
|
|
||||||
assert retrieved_message.name == 'MSG-002'
|
|
||||||
assert retrieved_message.description == 'Simple text message'
|
|
||||||
assert retrieved_message.content == 'Hey everyone, how is the project going?'
|
|
||||||
assert retrieved_message.chat_group_id == 2
|
|
||||||
assert retrieved_message.sender_id == 456
|
|
||||||
assert retrieved_message.parent_messages.len == 0
|
|
||||||
assert retrieved_message.fs_files.len == 0
|
|
||||||
assert retrieved_message.message_type == .text
|
|
||||||
assert retrieved_message.status == .delivered
|
|
||||||
assert retrieved_message.reactions.len == 0
|
|
||||||
assert retrieved_message.mentions.len == 0
|
|
||||||
assert retrieved_message.created_at > 0
|
|
||||||
assert retrieved_message.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_message_types_and_status() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test different message types
|
|
||||||
message_types := [heromodels.MessageType.text, .image, .file, .system]
|
|
||||||
statuses := [heromodels.MessageStatus.sent, .delivered, .read, .failed, .deleted]
|
|
||||||
|
|
||||||
for i, msg_type in message_types {
|
|
||||||
status := statuses[i % statuses.len]
|
|
||||||
|
|
||||||
mut message := mydb.chat_message.new(
|
|
||||||
name: 'TEST-${i}'
|
|
||||||
description: 'Testing ${msg_type} with ${status} status'
|
|
||||||
content: 'Test message content for ${msg_type}'
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: u32(i + 1)
|
|
||||||
parent_messages: []MessageLink{}
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: heromodels.MessageType(msg_type)
|
|
||||||
status: heromodels.MessageStatus(status)
|
|
||||||
reactions: []MessageReaction{}
|
|
||||||
mentions: []u32{}
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['test']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create message with type ${msg_type}: ${err}') }
|
|
||||||
|
|
||||||
message = mydb.chat_message.set(message) or {
|
|
||||||
panic('Failed to save message with type ${msg_type}: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
retrieved_message := mydb.chat_message.get(message.id) or {
|
|
||||||
panic('Failed to get message with type ${msg_type}: ${err}')
|
|
||||||
}
|
|
||||||
assert retrieved_message.message_type == heromodels.MessageType(msg_type)
|
|
||||||
assert retrieved_message.status == heromodels.MessageStatus(status)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_message_reactions() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
|
|
||||||
// Create message with multiple reactions
|
|
||||||
mut message := mydb.chat_message.new(
|
|
||||||
name: 'REACT-MSG'
|
|
||||||
description: 'Message with reactions'
|
|
||||||
content: 'This is a great idea! 🎉'
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: 100
|
|
||||||
parent_messages: []MessageLink{}
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: .text
|
|
||||||
status: .read
|
|
||||||
reactions: [
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 101
|
|
||||||
emoji: '👍'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 102
|
|
||||||
emoji: '❤️'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 103
|
|
||||||
emoji: '😂'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 104
|
|
||||||
emoji: '👍'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
]
|
|
||||||
mentions: []u32{}
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['positive', 'reactions']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create message with reactions: ${err}') }
|
|
||||||
|
|
||||||
message = mydb.chat_message.set(message) or { panic('Failed to save message with reactions: ${err}') }
|
|
||||||
|
|
||||||
retrieved_message := mydb.chat_message.get(message.id) or {
|
|
||||||
panic('Failed to get message with reactions: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify all reactions are preserved
|
|
||||||
assert retrieved_message.reactions.len == 4
|
|
||||||
|
|
||||||
// Count reaction types
|
|
||||||
mut thumbs_up_count := 0
|
|
||||||
mut heart_count := 0
|
|
||||||
mut laugh_count := 0
|
|
||||||
|
|
||||||
for reaction in retrieved_message.reactions {
|
|
||||||
match reaction.emoji {
|
|
||||||
'👍' { thumbs_up_count++ }
|
|
||||||
'❤️' { heart_count++ }
|
|
||||||
'😂' { laugh_count++ }
|
|
||||||
else {}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
assert thumbs_up_count == 2
|
|
||||||
assert heart_count == 1
|
|
||||||
assert laugh_count == 1
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_message_thread() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create original message
|
|
||||||
mut original_message := mydb.chat_message.new(
|
|
||||||
name: 'THREAD-ORIGINAL'
|
|
||||||
description: 'Original message in thread'
|
|
||||||
content: 'What do you think about the new feature proposal?'
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: 200
|
|
||||||
parent_messages: []MessageLink{}
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: .text
|
|
||||||
status: .read
|
|
||||||
reactions: []MessageReaction{}
|
|
||||||
mentions: []u32{}
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['discussion', 'feature']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create original message: ${err}') }
|
|
||||||
|
|
||||||
original_message = mydb.chat_message.set(original_message) or {
|
|
||||||
panic('Failed to save original message: ${err}')
|
|
||||||
}
|
|
||||||
original_id := original_message.id
|
|
||||||
|
|
||||||
// Create reply message
|
|
||||||
mut reply_message := mydb.chat_message.new(
|
|
||||||
name: 'THREAD-REPLY-1'
|
|
||||||
description: 'Reply to original message'
|
|
||||||
content: "I think it's a great idea! We should implement it."
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: 201
|
|
||||||
parent_messages: [
|
|
||||||
MessageLink{
|
|
||||||
message_id: original_id
|
|
||||||
link_type: .reply
|
|
||||||
},
|
|
||||||
]
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: .text
|
|
||||||
status: .sent
|
|
||||||
reactions: []MessageReaction{}
|
|
||||||
mentions: [u32(200)] // Mention original sender
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['reply', 'positive']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create reply message: ${err}') }
|
|
||||||
|
|
||||||
reply_message = mydb.chat_message.set(reply_message) or { panic('Failed to save reply message: ${err}') }
|
|
||||||
|
|
||||||
// Create another reply
|
|
||||||
mut reply2_message := mydb.chat_message.new(
|
|
||||||
name: 'THREAD-REPLY-2'
|
|
||||||
description: 'Second reply to original message'
|
|
||||||
content: "I agree with @user201. Let's schedule a meeting to discuss details."
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: 202
|
|
||||||
parent_messages: [
|
|
||||||
MessageLink{
|
|
||||||
message_id: original_id
|
|
||||||
link_type: .reply
|
|
||||||
},
|
|
||||||
]
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: .text
|
|
||||||
status: .delivered
|
|
||||||
reactions: []MessageReaction{}
|
|
||||||
mentions: [u32(200), 201] // Mention both previous users
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['reply', 'meeting']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create second reply message: ${err}') }
|
|
||||||
|
|
||||||
reply2_message = mydb.chat_message.set(reply2_message) or {
|
|
||||||
panic('Failed to save second reply message: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify thread structure
|
|
||||||
retrieved_original := mydb.chat_message.get(original_id) or {
|
|
||||||
panic('Failed to get original message: ${err}')
|
|
||||||
}
|
|
||||||
retrieved_reply1 := mydb.chat_message.get(reply_message.id) or {
|
|
||||||
panic('Failed to get first reply: ${err}')
|
|
||||||
}
|
|
||||||
retrieved_reply2 := mydb.chat_message.get(reply2_message.id) or {
|
|
||||||
panic('Failed to get second reply: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Original message should have no parent
|
|
||||||
assert retrieved_original.parent_messages.len == 0
|
|
||||||
|
|
||||||
// Both replies should reference the original message
|
|
||||||
assert retrieved_reply1.parent_messages.len == 1
|
|
||||||
assert retrieved_reply1.parent_messages[0].message_id == original_id
|
|
||||||
assert retrieved_reply1.parent_messages[0].link_type == .reply
|
|
||||||
assert retrieved_reply1.mentions.len == 1
|
|
||||||
assert retrieved_reply1.mentions[0] == 200
|
|
||||||
|
|
||||||
assert retrieved_reply2.parent_messages.len == 1
|
|
||||||
assert retrieved_reply2.parent_messages[0].message_id == original_id
|
|
||||||
assert retrieved_reply2.parent_messages[0].link_type == .reply
|
|
||||||
assert retrieved_reply2.mentions.len == 2
|
|
||||||
assert retrieved_reply2.mentions.contains(200)
|
|
||||||
assert retrieved_reply2.mentions.contains(201)
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_chat_message_update() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
|
|
||||||
// Create and save a message
|
|
||||||
mut message := mydb.chat_message.new(
|
|
||||||
name: 'EDIT-MSG'
|
|
||||||
description: 'Original message'
|
|
||||||
content: 'Original content'
|
|
||||||
chat_group_id: 1
|
|
||||||
sender_id: 300
|
|
||||||
parent_messages: []MessageLink{}
|
|
||||||
fs_files: []u32{}
|
|
||||||
message_type: .text
|
|
||||||
status: .sent
|
|
||||||
reactions: []MessageReaction{}
|
|
||||||
mentions: []u32{}
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['original']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create chat message: ${err}') }
|
|
||||||
|
|
||||||
message = mydb.chat_message.set(message) or { panic('Failed to save chat message: ${err}') }
|
|
||||||
original_id := message.id
|
|
||||||
original_created_at := message.created_at
|
|
||||||
original_updated_at := message.updated_at
|
|
||||||
|
|
||||||
// Update the message
|
|
||||||
message.name = 'EDIT-MSG-UPDATED'
|
|
||||||
message.description = 'Updated message'
|
|
||||||
message.content = 'Updated content - this message has been edited'
|
|
||||||
message.status = .read
|
|
||||||
message.reactions = [
|
|
||||||
MessageReaction{
|
|
||||||
user_id: 301
|
|
||||||
emoji: '👍'
|
|
||||||
timestamp: now
|
|
||||||
},
|
|
||||||
]
|
|
||||||
message.mentions = [u32(302)]
|
|
||||||
|
|
||||||
message = mydb.chat_message.set(message) or { panic('Failed to update chat message: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID remains the same and updated_at is set
|
|
||||||
assert message.id == original_id
|
|
||||||
assert message.created_at == original_created_at
|
|
||||||
assert message.updated_at >= original_updated_at
|
|
||||||
|
|
||||||
// Retrieve and verify updates
|
|
||||||
updated_message := mydb.chat_message.get(message.id) or {
|
|
||||||
panic('Failed to get updated chat message: ${err}')
|
|
||||||
}
|
|
||||||
assert updated_message.name == 'EDIT-MSG-UPDATED'
|
|
||||||
assert updated_message.description == 'Updated message'
|
|
||||||
assert updated_message.content == 'Updated content - this message has been edited'
|
|
||||||
assert updated_message.status == .read
|
|
||||||
assert updated_message.reactions.len == 1
|
|
||||||
assert updated_message.reactions[0].user_id == 301
|
|
||||||
assert updated_message.mentions.len == 1
|
|
||||||
assert updated_message.mentions[0] == 302
|
|
||||||
}
|
|
||||||
@@ -22,6 +22,14 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }
 
+@[params]
+pub struct CommentListArg {
+pub mut:
+	parent u32
+	author u32
+	limit  int = 100 // Default limit is 100
+}
+
 pub fn (self Comment) type_name() string {
 	return 'comments'
 }
@@ -74,7 +82,7 @@ pub fn (self Comment) example(methodname string) (string, string) {
 	}
 }
 
-pub fn (self Comment) dump(mut e encoder.Encoder) ! {
+fn (self Comment) dump(mut e encoder.Encoder) ! {
 	e.add_string(self.comment)
 	e.add_u32(self.parent)
 	e.add_u32(self.author)
@@ -125,6 +133,39 @@ pub fn (mut self DBComments) get(id u32) !Comment {
 	return o
 }
 
-pub fn (mut self DBComments) list() ![]Comment {
-	return self.db.list[Comment]()!.map(self.get(it)!)
+pub fn (mut self DBComments) list(args CommentListArg) ![]Comment {
+	// Require at least one parameter to be provided
+	if args.parent == 0 && args.author == 0 {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all comments from the database
+	all_comments := self.db.list[Comment]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_comments := []Comment{}
+	for comment in all_comments {
+		// Filter by parent if provided
+		if args.parent != 0 && comment.parent != args.parent {
+			continue
+		}
+
+		// Filter by author if provided
+		if args.author != 0 && comment.author != args.author {
+			continue
+		}
+
+		filtered_comments << comment
+	}
+
+	// Limit results to 100 or the specified limit
+	mut limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_comments.len > limit {
+		return filtered_comments[..limit]
+	}
+
+	return filtered_comments
 }
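A hypothetical call against the new comment filter, again assuming the `mydb.comments` accessor from the test files in this commit; the parent ID is invented for illustration:

	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
	// All replies under parent comment 42, at most 10 of them.
	replies := mydb.comments.list(parent: 42, limit: 10) or {
		panic('Failed to list comments: ${err}')
	}
	assert replies.all(it.parent == 42)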
@@ -1,276 +0,0 @@
|
|||||||
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test
|
|
||||||
|
|
||||||
module heromodels
|
|
||||||
|
|
||||||
import freeflowuniverse.herolib.hero.heromodels
|
|
||||||
|
|
||||||
// Test Comment model CRUD operations
|
|
||||||
fn test_comment_new() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test creating a new comment with all fields
|
|
||||||
mut comment := mydb.comments.new(
|
|
||||||
comment: 'This is a test comment'
|
|
||||||
parent: 0
|
|
||||||
author: 123
|
|
||||||
) or { panic('Failed to create comment: ${err}') }
|
|
||||||
|
|
||||||
// Verify the comment was created with correct values
|
|
||||||
assert comment.comment == 'This is a test comment'
|
|
||||||
assert comment.parent == 0
|
|
||||||
assert comment.author == 123
|
|
||||||
assert comment.id == 0 // Should be 0 before saving
|
|
||||||
assert comment.updated_at > 0 // Should have timestamp
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_set_and_get() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a comment
|
|
||||||
mut comment := mydb.comments.new(
|
|
||||||
comment: 'Hello, this is my first comment!'
|
|
||||||
parent: 0
|
|
||||||
author: 456
|
|
||||||
) or { panic('Failed to create comment: ${err}') }
|
|
||||||
|
|
||||||
// Save the comment
|
|
||||||
comment = mydb.comments.set(comment) or { panic('Failed to save comment: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
|
||||||
assert comment.id > 0
|
|
||||||
original_id := comment.id
|
|
||||||
|
|
||||||
// Retrieve the comment
|
|
||||||
retrieved_comment := mydb.comments.get(comment.id) or { panic('Failed to get comment: ${err}') }
|
|
||||||
|
|
||||||
// Verify all fields match
|
|
||||||
assert retrieved_comment.id == original_id
|
|
||||||
assert retrieved_comment.comment == 'Hello, this is my first comment!'
|
|
||||||
assert retrieved_comment.parent == 0
|
|
||||||
assert retrieved_comment.author == 456
|
|
||||||
assert retrieved_comment.created_at > 0
|
|
||||||
assert retrieved_comment.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_reply() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a parent comment
|
|
||||||
mut parent_comment := mydb.comments.new(
|
|
||||||
comment: 'This is the parent comment'
|
|
||||||
parent: 0
|
|
||||||
author: 100
|
|
||||||
) or { panic('Failed to create parent comment: ${err}') }
|
|
||||||
|
|
||||||
parent_comment = mydb.comments.set(parent_comment) or { panic('Failed to save parent comment: ${err}') }
|
|
||||||
parent_id := parent_comment.id
|
|
||||||
|
|
||||||
// Create a reply comment
|
|
||||||
mut reply_comment := mydb.comments.new(
|
|
||||||
comment: 'This is a reply to the parent comment'
|
|
||||||
parent: parent_id
|
|
||||||
author: 200
|
|
||||||
) or { panic('Failed to create reply comment: ${err}') }
|
|
||||||
|
|
||||||
reply_comment = mydb.comments.set(reply_comment) or { panic('Failed to save reply comment: ${err}') }
|
|
||||||
|
|
||||||
// Retrieve both comments
|
|
||||||
retrieved_parent := mydb.comments.get(parent_id) or { panic('Failed to get parent comment: ${err}') }
|
|
||||||
retrieved_reply := mydb.comments.get(reply_comment.id) or { panic('Failed to get reply comment: ${err}') }
|
|
||||||
|
|
||||||
// Verify parent comment
|
|
||||||
assert retrieved_parent.comment == 'This is the parent comment'
|
|
||||||
assert retrieved_parent.parent == 0
|
|
||||||
assert retrieved_parent.author == 100
|
|
||||||
|
|
||||||
// Verify reply comment
|
|
||||||
assert retrieved_reply.comment == 'This is a reply to the parent comment'
|
|
||||||
assert retrieved_reply.parent == parent_id
|
|
||||||
assert retrieved_reply.author == 200
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_update() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save a comment
|
|
||||||
mut comment := mydb.comments.new(
|
|
||||||
comment: 'Original comment text'
|
|
||||||
parent: 0
|
|
||||||
author: 300
|
|
||||||
) or { panic('Failed to create comment: ${err}') }
|
|
||||||
|
|
||||||
comment = mydb.comments.set(comment) or { panic('Failed to save comment: ${err}') }
|
|
||||||
original_id := comment.id
|
|
||||||
original_created_at := comment.created_at
|
|
||||||
original_updated_at := comment.updated_at
|
|
||||||
|
|
||||||
// Update the comment
|
|
||||||
comment.comment = 'Updated comment text'
|
|
||||||
comment.parent = 999
|
|
||||||
comment.author = 400
|
|
||||||
|
|
||||||
comment = mydb.comments.set(comment) or { panic('Failed to update comment: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID remains the same and updated_at is set
|
|
||||||
assert comment.id == original_id
|
|
||||||
assert comment.created_at == original_created_at
|
|
||||||
assert comment.updated_at >= original_updated_at
|
|
||||||
|
|
||||||
// Retrieve and verify updates
|
|
||||||
updated_comment := mydb.comments.get(comment.id) or { panic('Failed to get updated comment: ${err}') }
|
|
||||||
assert updated_comment.comment == 'Updated comment text'
|
|
||||||
assert updated_comment.parent == 999
|
|
||||||
assert updated_comment.author == 400
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_exist() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test non-existent comment
|
|
||||||
exists := mydb.comments.exist(999) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists == false
|
|
||||||
|
|
||||||
// Create and save a comment
|
|
||||||
mut comment := mydb.comments.new(
|
|
||||||
comment: 'Existence test comment'
|
|
||||||
parent: 0
|
|
||||||
author: 500
|
|
||||||
) or { panic('Failed to create comment: ${err}') }
|
|
||||||
|
|
||||||
comment = mydb.comments.set(comment) or { panic('Failed to save comment: ${err}') }
|
|
||||||
|
|
||||||
// Test existing comment
|
|
||||||
exists_after_save := mydb.comments.exist(comment.id) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists_after_save == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_delete() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save a comment
|
|
||||||
mut comment := mydb.comments.new(
|
|
||||||
comment: 'This comment will be deleted'
|
|
||||||
parent: 0
|
|
||||||
author: 600
|
|
||||||
) or { panic('Failed to create comment: ${err}') }
|
|
||||||
|
|
||||||
comment = mydb.comments.set(comment) or { panic('Failed to save comment: ${err}') }
|
|
||||||
comment_id := comment.id
|
|
||||||
|
|
||||||
// Verify it exists
|
|
||||||
exists_before := mydb.comments.exist(comment_id) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists_before == true
|
|
||||||
|
|
||||||
// Delete the comment
|
|
||||||
mydb.comments.delete(comment_id) or { panic('Failed to delete comment: ${err}') }
|
|
||||||
|
|
||||||
// Verify it no longer exists
|
|
||||||
exists_after := mydb.comments.exist(comment_id) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists_after == false
|
|
||||||
|
|
||||||
// Verify get fails
|
|
||||||
if _ := mydb.comments.get(comment_id) {
|
|
||||||
panic('Should not be able to get deleted comment')
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_list() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Clear any existing comments by creating a fresh DB
|
|
||||||
mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }
|
|
||||||
|
|
||||||
// Initially should be empty
|
|
||||||
initial_list := mydb.comments.list() or { panic('Failed to list comments: ${err}') }
|
|
||||||
initial_count := initial_list.len
|
|
||||||
|
|
||||||
// Create multiple comments
|
|
||||||
mut comment1 := mydb.comments.new(
|
|
||||||
comment: 'First comment'
|
|
||||||
parent: 0
|
|
||||||
author: 700
|
|
||||||
) or { panic('Failed to create comment1: ${err}') }
|
|
||||||
|
|
||||||
mut comment2 := mydb.comments.new(
|
|
||||||
comment: 'Second comment'
|
|
||||||
parent: 0
|
|
||||||
author: 800
|
|
||||||
) or { panic('Failed to create comment2: ${err}') }
|
|
||||||
|
|
||||||
// Save both comments
|
|
||||||
comment1 = mydb.comments.set(comment1) or { panic('Failed to save comment1: ${err}') }
|
|
||||||
comment2 = mydb.comments.set(comment2) or { panic('Failed to save comment2: ${err}') }
|
|
||||||
|
|
||||||
// List comments
|
|
||||||
comment_list := mydb.comments.list() or { panic('Failed to list comments: ${err}') }
|
|
||||||
|
|
||||||
// Should have 2 more comments than initially
|
|
||||||
assert comment_list.len == initial_count + 2
|
|
||||||
|
|
||||||
// Find our comments in the list
|
|
||||||
mut found_comment1 := false
|
|
||||||
mut found_comment2 := false
|
|
||||||
|
|
||||||
for cmt in comment_list {
|
|
||||||
if cmt.comment == 'First comment' {
|
|
||||||
found_comment1 = true
|
|
||||||
assert cmt.author == 700
|
|
||||||
assert cmt.parent == 0
|
|
||||||
}
|
|
||||||
if cmt.comment == 'Second comment' {
|
|
||||||
found_comment2 = true
|
|
||||||
assert cmt.author == 800
|
|
||||||
assert cmt.parent == 0
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
assert found_comment1 == true
|
|
||||||
assert found_comment2 == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_comment_edge_cases() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test empty comment
|
|
||||||
mut empty_comment := mydb.comments.new(
|
|
||||||
comment: ''
|
|
||||||
parent: 0
|
|
||||||
author: 0
|
|
||||||
) or { panic('Failed to create empty comment: ${err}') }
|
|
||||||
|
|
||||||
empty_comment = mydb.comments.set(empty_comment) or { panic('Failed to save empty comment: ${err}') }
|
|
||||||
|
|
||||||
retrieved_empty := mydb.comments.get(empty_comment.id) or { panic('Failed to get empty comment: ${err}') }
|
|
||||||
assert retrieved_empty.comment == ''
|
|
||||||
assert retrieved_empty.parent == 0
|
|
||||||
assert retrieved_empty.author == 0
|
|
||||||
|
|
||||||
// Test very long comment
|
|
||||||
long_text := 'This is a very long comment. '.repeat(100)
|
|
||||||
mut long_comment := mydb.comments.new(
|
|
||||||
comment: long_text
|
|
||||||
parent: 12345
|
|
||||||
author: 99999
|
|
||||||
) or { panic('Failed to create long comment: ${err}') }
|
|
||||||
|
|
||||||
long_comment = mydb.comments.set(long_comment) or { panic('Failed to save long comment: ${err}') }
|
|
||||||
|
|
||||||
retrieved_long := mydb.comments.get(long_comment.id) or { panic('Failed to get long comment: ${err}') }
|
|
||||||
assert retrieved_long.comment == long_text
|
|
||||||
assert retrieved_long.parent == 12345
|
|
||||||
assert retrieved_long.author == 99999
|
|
||||||
|
|
||||||
// Test comment with special characters
|
|
||||||
special_text := 'Comment with special chars: !@#$%^&*()_+-=[]{}|;:,.<>?/~`'
|
|
||||||
mut special_comment := mydb.comments.new(
|
|
||||||
comment: special_text
|
|
||||||
parent: 0
|
|
||||||
author: 1000
|
|
||||||
) or { panic('Failed to create special comment: ${err}') }
|
|
||||||
|
|
||||||
special_comment = mydb.comments.set(special_comment) or { panic('Failed to save special comment: ${err}') }
|
|
||||||
|
|
||||||
retrieved_special := mydb.comments.get(special_comment.id) or { panic('Failed to get special comment: ${err}') }
|
|
||||||
assert retrieved_special.comment == special_text
|
|
||||||
assert retrieved_special.author == 1000
|
|
||||||
}
|
|
||||||
@@ -81,7 +81,7 @@ pub fn (self Group) example(methodname string) (string, string) {
 	}
 }

-pub fn (self Group) dump(mut e encoder.Encoder) ! {
+fn (self Group) dump(mut e encoder.Encoder) ! {
 	e.add_u16(u16(self.members.len))
 	for member in self.members {
 		e.add_u32(member.user_id)
@@ -93,7 +93,7 @@ pub fn (self Group) dump(mut e encoder.Encoder) ! {
 	e.add_bool(self.is_public)
 }

-fn (mut self DBGroup) load(mut o Group, mut e encoder.Decoder) ! {
+fn (mut o Group) load(mut e encoder.Decoder) ! {
 	members_len := e.get_u16()!
 	mut members := []GroupMember{}
 	for _ in 0 .. members_len {
@@ -130,6 +130,14 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }

+@[params]
+pub struct GroupListArg {
+pub mut:
+	is_public    bool
+	parent_group u32
+	limit        int = 100 // Default limit is 100
+}
+
 // get new group, not from the DB
 pub fn (mut self DBGroup) new(args GroupArg) !Group {
 	mut o := Group{
@@ -162,12 +170,49 @@ pub fn (mut self DBGroup) exist(id u32) !bool {
 pub fn (mut self DBGroup) get(id u32) !Group {
 	mut o, data := self.db.get_data[Group](id)!
 	mut e_decoder := encoder.decoder_new(data)
-	self.load(mut o, mut e_decoder)!
+	o.load(mut e_decoder)!
 	return o
 }

-pub fn (mut self DBGroup) list() ![]Group {
-	return self.db.list[Group]()!.map(self.get(it)!)
+pub fn (mut self DBGroup) list(args GroupListArg) ![]Group {
+	// Require at least one parameter to be provided
+	if !args.is_public && args.parent_group == 0 {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all groups from the database
+	mut all_groups := self.db.list[Group]()!
+	mut groups := []Group{}
+	for id in all_groups {
+		groups << self.get(id)!
+	}
+
+	// Apply filters
+	mut filtered_groups := []Group{}
+	for group in groups {
+		// Filter by is_public if provided (is_public is true)
+		if args.is_public && !group.is_public {
+			continue
+		}
+
+		// Filter by parent_group if provided
+		if args.parent_group != 0 && group.parent_group != args.parent_group {
+			continue
+		}
+
+		filtered_groups << group
+	}
+
+	// Limit results to 100 or the specified limit
+	mut limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_groups.len > limit {
+		return filtered_groups[..limit]
+	}
+
+	return filtered_groups
 }

 pub fn (mut self Group) add_member(user_id u32, role GroupRole) {
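The reworked DBGroup.list no longer returns every group: at least one of is_public or parent_group must be set, filtering happens in memory, and at most limit (capped at 100) groups come back. A minimal usage sketch, reusing the db.new_test() helper and DBGroup wiring that the rewritten tests in this commit use; the printed fields are only for illustration:

import freeflowuniverse.herolib.hero.db

fn example_list_public_groups() ! {
	// Same wiring as the rewritten tests: a test DB wrapped by a DBGroup accessor.
	mut mydb := db.new_test()!
	mut db_group := DBGroup{
		db: &mydb
	}
	// GroupListArg is @[params], so its fields can be passed as named arguments.
	// parent_group stays 0 here, so only the is_public filter applies.
	public_groups := db_group.list(is_public: true, limit: 50)!
	for g in public_groups {
		println('${g.id}: ${g.name}')
	}
}

Because is_public: false combined with parent_group: 0 trips the guard at the top of list, this argument struct cannot express "all private top-level groups".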
@@ -1,517 +1,459 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

 module heromodels

-import freeflowuniverse.herolib.hero.heromodels { GroupMember, GroupRole }
-import freeflowuniverse.herolib.data.ourtime
+import freeflowuniverse.herolib.hero.db

-// Test Group model CRUD operations
-fn test_group_new() {
-	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
+fn test_group_new() ! {
+	// Initialize DBGroup for testing
+	mut mydb := db.new_test()!
+	mut db_group := DBGroup{
+		db: &mydb
+	}

-	// Test creating a new group with all fields
-	now := ourtime.now().unix()
-	mut group := mydb.group.new(
-		name: 'Development Team'
-		description: 'Software development team for the main project'
-		members: [
-			GroupMember{
-				user_id: 1
-				role: .owner
-				joined_at: now
-			},
-			GroupMember{
-				user_id: 2
-				role: .admin
-				joined_at: now
-			},
-			GroupMember{
-				user_id: 3
-				role: .writer
-				joined_at: now
-			},
-		]
-		subgroups: [u32(10), 20, 30]
+	// Test creating a new group
+	mut args := GroupArg{
+		name: 'test_group'
+		description: 'Test group for unit testing'
+		members: []GroupMember{}
+		subgroups: []u32{}
 		parent_group: 0
-		is_public: false
-	) or { panic('Failed to create group: ${err}') }
+		is_public: true
+	}

-	// Verify the group was created with correct values
-	assert group.name == 'Development Team'
-	assert group.description == 'Software development team for the main project'
-	assert group.members.len == 3
-	assert group.members[0].user_id == 1
-	assert group.members[0].role == .owner
-	assert group.members[1].user_id == 2
-	assert group.members[1].role == .admin
-	assert group.members[2].user_id == 3
-	assert group.members[2].role == .writer
-	assert group.subgroups.len == 3
-	assert group.subgroups[0] == 10
+	group := db_group.new(args)!
+	assert group.name == 'test_group'
+	assert group.description == 'Test group for unit testing'
+	assert group.is_public == true
+	assert group.members.len == 0
+	assert group.subgroups.len == 0
 	assert group.parent_group == 0
-	assert group.is_public == false
-	assert group.id == 0 // Should be 0 before saving
-	assert group.updated_at > 0 // Should have timestamp
+	assert group.updated_at > 0
+	println('✓ Group new test passed!')
 }

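The rewritten tests drop the old heromodels.new() factory and build each accessor directly around a shared db.DB handle. A short sketch of that wiring, with the assumption (not shown in this hunk) that DBProject and the other DB* accessors expose the same db &db.DB field as DBGroup:

// Assumed pattern: every DB<Model> accessor wraps the same &db.DB instance.
mut mydb := db.new_test()!
mut db_group := DBGroup{
	db: &mydb
}
mut db_project := DBProject{
	db: &mydb
}
// Both accessors now read and write the same underlying test database.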
fn test_group_set_and_get() {
|
fn test_group_crud_operations() ! {
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
|
}
|
||||||
|
|
||||||
// Create a group
|
// Create a new group
|
||||||
now := ourtime.now().unix()
|
mut args := GroupArg{
|
||||||
mut group := mydb.group.new(
|
name: 'crud_test_group'
|
||||||
name: 'Marketing Team'
|
description: 'Test group for CRUD operations'
|
||||||
description: 'Marketing and communications team'
|
members: []GroupMember{}
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 100
|
|
||||||
role: .owner
|
|
||||||
joined_at: now - 86400 // 1 day ago
|
|
||||||
},
|
|
||||||
GroupMember{
|
|
||||||
user_id: 101
|
|
||||||
role: .writer
|
|
||||||
joined_at: now
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: []u32{}
|
subgroups: []u32{}
|
||||||
parent_group: 5
|
parent_group: 0
|
||||||
is_public: true
|
is_public: false
|
||||||
) or { panic('Failed to create group: ${err}') }
|
}
|
||||||
|
|
||||||
// Save the group
|
mut group := db_group.new(args)!
|
||||||
group = mydb.group.set(group) or { panic('Failed to save group: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
// Test set operation
|
||||||
assert group.id > 0
|
group = db_group.set(group)!
|
||||||
original_id := group.id
|
original_id := group.id
|
||||||
|
|
||||||
// Retrieve the group
|
// Test get operation
|
||||||
retrieved_group := mydb.group.get(group.id) or { panic('Failed to get group: ${err}') }
|
retrieved_group := db_group.get(original_id)!
|
||||||
|
assert retrieved_group.name == 'crud_test_group'
|
||||||
// Verify all fields match
|
assert retrieved_group.description == 'Test group for CRUD operations'
|
||||||
|
assert retrieved_group.is_public == false
|
||||||
assert retrieved_group.id == original_id
|
assert retrieved_group.id == original_id
|
||||||
assert retrieved_group.name == 'Marketing Team'
|
|
||||||
assert retrieved_group.description == 'Marketing and communications team'
|
|
||||||
assert retrieved_group.members.len == 2
|
|
||||||
assert retrieved_group.members[0].user_id == 100
|
|
||||||
assert retrieved_group.members[0].role == .owner
|
|
||||||
assert retrieved_group.members[1].user_id == 101
|
|
||||||
assert retrieved_group.members[1].role == .writer
|
|
||||||
assert retrieved_group.subgroups.len == 0
|
|
||||||
assert retrieved_group.parent_group == 5
|
|
||||||
assert retrieved_group.is_public == true
|
|
||||||
assert retrieved_group.created_at > 0
|
|
||||||
assert retrieved_group.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_group_roles() {
|
// Test exist operation
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
exists := db_group.exist(original_id)!
|
||||||
|
assert exists == true
|
||||||
|
|
||||||
// Test all group roles
|
// Test update
|
||||||
roles := [GroupRole.reader, .writer, .admin, .owner]
|
mut updated_args := GroupArg{
|
||||||
now := ourtime.now().unix()
|
name: 'updated_group'
|
||||||
|
description: 'Updated test group'
|
||||||
mut members := []GroupMember{}
|
|
||||||
for i, role in roles {
|
|
||||||
members << GroupMember{
|
|
||||||
user_id: u32(i + 1)
|
|
||||||
role: role
|
|
||||||
joined_at: now + i64(i * 3600) // Different join times
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
mut group := mydb.group.new(
|
|
||||||
name: 'Role Test Group'
|
|
||||||
description: 'Testing all group roles'
|
|
||||||
members: members
|
|
||||||
subgroups: []u32{}
|
|
||||||
parent_group: 0
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create group: ${err}') }
|
|
||||||
|
|
||||||
group = mydb.group.set(group) or { panic('Failed to save group: ${err}') }
|
|
||||||
|
|
||||||
retrieved_group := mydb.group.get(group.id) or { panic('Failed to get group: ${err}') }
|
|
||||||
|
|
||||||
// Verify all roles are preserved
|
|
||||||
assert retrieved_group.members.len == 4
|
|
||||||
assert retrieved_group.members[0].role == .reader
|
|
||||||
assert retrieved_group.members[1].role == .writer
|
|
||||||
assert retrieved_group.members[2].role == .admin
|
|
||||||
assert retrieved_group.members[3].role == .owner
|
|
||||||
|
|
||||||
// Verify join times are preserved
|
|
||||||
for i, member in retrieved_group.members {
|
|
||||||
assert member.joined_at == now + i64(i * 3600)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_group_hierarchy() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create parent group
|
|
||||||
mut parent_group := mydb.group.new(
|
|
||||||
name: 'Engineering'
|
|
||||||
description: 'Main engineering group'
|
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 1
|
|
||||||
role: .owner
|
|
||||||
joined_at: ourtime.now().unix()
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: []u32{}
|
|
||||||
parent_group: 0
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create parent group: ${err}') }
|
|
||||||
|
|
||||||
parent_group = mydb.group.set(parent_group) or { panic('Failed to save parent group: ${err}') }
|
|
||||||
parent_id := parent_group.id
|
|
||||||
|
|
||||||
// Create child groups
|
|
||||||
mut frontend_group := mydb.group.new(
|
|
||||||
name: 'Frontend Team'
|
|
||||||
description: 'Frontend development team'
|
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 10
|
|
||||||
role: .admin
|
|
||||||
joined_at: ourtime.now().unix()
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: []u32{}
|
|
||||||
parent_group: parent_id
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create frontend group: ${err}') }
|
|
||||||
|
|
||||||
mut backend_group := mydb.group.new(
|
|
||||||
name: 'Backend Team'
|
|
||||||
description: 'Backend development team'
|
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 20
|
|
||||||
role: .admin
|
|
||||||
joined_at: ourtime.now().unix()
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: []u32{}
|
|
||||||
parent_group: parent_id
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create backend group: ${err}') }
|
|
||||||
|
|
||||||
frontend_group = mydb.group.set(frontend_group) or { panic('Failed to save frontend group: ${err}') }
|
|
||||||
backend_group = mydb.group.set(backend_group) or { panic('Failed to save backend group: ${err}') }
|
|
||||||
|
|
||||||
// Update parent group with subgroups
|
|
||||||
parent_group.subgroups = [frontend_group.id, backend_group.id]
|
|
||||||
parent_group = mydb.group.set(parent_group) or { panic('Failed to update parent group: ${err}') }
|
|
||||||
|
|
||||||
// Verify hierarchy
|
|
||||||
retrieved_parent := mydb.group.get(parent_id) or { panic('Failed to get parent group: ${err}') }
|
|
||||||
retrieved_frontend := mydb.group.get(frontend_group.id) or {
|
|
||||||
panic('Failed to get frontend group: ${err}')
|
|
||||||
}
|
|
||||||
retrieved_backend := mydb.group.get(backend_group.id) or {
|
|
||||||
panic('Failed to get backend group: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
assert retrieved_parent.subgroups.len == 2
|
|
||||||
assert retrieved_parent.subgroups.contains(frontend_group.id)
|
|
||||||
assert retrieved_parent.subgroups.contains(backend_group.id)
|
|
||||||
assert retrieved_frontend.parent_group == parent_id
|
|
||||||
assert retrieved_backend.parent_group == parent_id
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_group_update() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create and save a group
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut group := mydb.group.new(
|
|
||||||
name: 'Original Group'
|
|
||||||
description: 'Original description'
|
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 1
|
|
||||||
role: .reader
|
|
||||||
joined_at: now
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: []u32{}
|
|
||||||
parent_group: 0
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create group: ${err}') }
|
|
||||||
|
|
||||||
group = mydb.group.set(group) or { panic('Failed to save group: ${err}') }
|
|
||||||
original_id := group.id
|
|
||||||
original_created_at := group.created_at
|
|
||||||
original_updated_at := group.updated_at
|
|
||||||
|
|
||||||
// Update the group
|
|
||||||
group.name = 'Updated Group'
|
|
||||||
group.description = 'Updated description'
|
|
||||||
group.members = [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 1
|
|
||||||
role: .admin
|
|
||||||
joined_at: now
|
|
||||||
},
|
|
||||||
GroupMember{
|
|
||||||
user_id: 2
|
|
||||||
role: .writer
|
|
||||||
joined_at: now + 3600
|
|
||||||
},
|
|
||||||
]
|
|
||||||
group.subgroups = [u32(100), 200]
|
|
||||||
group.parent_group = 50
|
|
||||||
group.is_public = true
|
|
||||||
|
|
||||||
group = mydb.group.set(group) or { panic('Failed to update group: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID remains the same and updated_at is set
|
|
||||||
assert group.id == original_id
|
|
||||||
assert group.created_at == original_created_at
|
|
||||||
assert group.updated_at >= original_updated_at
|
|
||||||
|
|
||||||
// Retrieve and verify updates
|
|
||||||
updated_group := mydb.group.get(group.id) or { panic('Failed to get updated group: ${err}') }
|
|
||||||
assert updated_group.name == 'Updated Group'
|
|
||||||
assert updated_group.description == 'Updated description'
|
|
||||||
assert updated_group.members.len == 2
|
|
||||||
assert updated_group.members[0].role == .admin
|
|
||||||
assert updated_group.members[1].role == .writer
|
|
||||||
assert updated_group.subgroups.len == 2
|
|
||||||
assert updated_group.subgroups[0] == 100
|
|
||||||
assert updated_group.parent_group == 50
|
|
||||||
assert updated_group.is_public == true
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_group_exist() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test non-existent group with a very high ID that shouldn't exist
|
|
||||||
exists := mydb.group.exist(999999) or { panic('Failed to check existence: ${err}') }
|
|
||||||
assert exists == false
|
|
||||||
|
|
||||||
// Create and save a group
|
|
||||||
mut group := mydb.group.new(
|
|
||||||
name: 'Existence Test'
|
|
||||||
description: 'Testing existence'
|
|
||||||
members: []GroupMember{}
|
members: []GroupMember{}
|
||||||
subgroups: []u32{}
|
subgroups: []u32{}
|
||||||
parent_group: 0
|
parent_group: 0
|
||||||
is_public: true
|
is_public: true
|
||||||
) or { panic('Failed to create group: ${err}') }
|
|
||||||
|
|
||||||
group = mydb.group.set(group) or { panic('Failed to save group: ${err}') }
|
|
||||||
|
|
||||||
// Test existing group
|
|
||||||
exists_after_save := mydb.group.exist(group.id) or {
|
|
||||||
panic('Failed to check existence: ${err}')
|
|
||||||
}
|
}
|
||||||
assert exists_after_save == true
|
|
||||||
|
mut updated_group := db_group.new(updated_args)!
|
||||||
|
updated_group.id = original_id
|
||||||
|
updated_group = db_group.set(updated_group)!
|
||||||
|
|
||||||
|
// Verify update
|
||||||
|
final_group := db_group.get(original_id)!
|
||||||
|
assert final_group.name == 'updated_group'
|
||||||
|
assert final_group.description == 'Updated test group'
|
||||||
|
assert final_group.is_public == true
|
||||||
|
|
||||||
|
// Test delete operation
|
||||||
|
db_group.delete(original_id)!
|
||||||
|
|
||||||
|
// Verify deletion
|
||||||
|
exists_after_delete := db_group.exist(original_id)!
|
||||||
|
assert exists_after_delete == false
|
||||||
|
|
||||||
|
println('✓ Group CRUD operations test passed!')
|
||||||
}
|
}
|
||||||
|
|
||||||
fn test_group_delete() {
|
fn test_group_member_operations() ! {
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
|
}
|
||||||
|
|
||||||
// Create and save a group
|
// Create a new group
|
||||||
mut group := mydb.group.new(
|
mut args := GroupArg{
|
||||||
name: 'To Be Deleted'
|
name: 'member_test_group'
|
||||||
description: 'This group will be deleted'
|
description: 'Test group for member operations'
|
||||||
members: []GroupMember{}
|
members: []GroupMember{}
|
||||||
subgroups: []u32{}
|
subgroups: []u32{}
|
||||||
parent_group: 0
|
parent_group: 0
|
||||||
is_public: false
|
is_public: true
|
||||||
) or { panic('Failed to create group: ${err}') }
|
}
|
||||||
|
|
||||||
group = mydb.group.set(group) or { panic('Failed to save group: ${err}') }
|
mut group := db_group.new(args)!
|
||||||
|
group = db_group.set(group)!
|
||||||
group_id := group.id
|
group_id := group.id
|
||||||
|
|
||||||
// Verify it exists
|
// Test add_member
|
||||||
exists_before := mydb.group.exist(group_id) or { panic('Failed to check existence: ${err}') }
|
group.add_member(100, .admin)
|
||||||
assert exists_before == true
|
group.add_member(101, .writer)
|
||||||
|
group.add_member(102, .reader)
|
||||||
|
|
||||||
// Delete the group
|
// Save updated group
|
||||||
mydb.group.delete(group_id) or { panic('Failed to delete group: ${err}') }
|
group = db_group.set(group)!
|
||||||
|
|
||||||
// Verify it no longer exists
|
// Verify members were added
|
||||||
exists_after := mydb.group.exist(group_id) or { panic('Failed to check existence: ${err}') }
|
updated_group := db_group.get(group_id)!
|
||||||
assert exists_after == false
|
assert updated_group.members.len == 3
|
||||||
|
|
||||||
// Verify get fails
|
// Check first member
|
||||||
if _ := mydb.group.get(group_id) {
|
assert updated_group.members[0].user_id == 100
|
||||||
panic('Should not be able to get deleted group')
|
assert updated_group.members[0].role == .admin
|
||||||
}
|
|
||||||
|
// Check second member
|
||||||
|
assert updated_group.members[1].user_id == 101
|
||||||
|
assert updated_group.members[1].role == .writer
|
||||||
|
|
||||||
|
// Check third member
|
||||||
|
assert updated_group.members[2].user_id == 102
|
||||||
|
assert updated_group.members[2].role == .reader
|
||||||
|
|
||||||
|
// Verify joined_at timestamps are set
|
||||||
|
assert updated_group.members[0].joined_at > 0
|
||||||
|
assert updated_group.members[1].joined_at > 0
|
||||||
|
assert updated_group.members[2].joined_at > 0
|
||||||
|
|
||||||
|
println('✓ Group member operations test passed!')
|
||||||
}
|
}
|
||||||
|
|
||||||
fn test_group_list() {
|
fn test_group_hierarchy_operations() ! {
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
// Clear any existing groups by creating a fresh DB
|
mut db_group := DBGroup{
|
||||||
mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }
|
db: &mydb
|
||||||
|
|
||||||
// Initially should be empty
|
|
||||||
initial_list := mydb.group.list() or { panic('Failed to list groups: ${err}') }
|
|
||||||
initial_count := initial_list.len
|
|
||||||
|
|
||||||
// Create multiple groups
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut group1 := mydb.group.new(
|
|
||||||
name: 'Group One'
|
|
||||||
description: 'First test group'
|
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 1
|
|
||||||
role: .owner
|
|
||||||
joined_at: now
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: []u32{}
|
|
||||||
parent_group: 0
|
|
||||||
is_public: true
|
|
||||||
) or { panic('Failed to create group1: ${err}') }
|
|
||||||
|
|
||||||
mut group2 := mydb.group.new(
|
|
||||||
name: 'Group Two'
|
|
||||||
description: 'Second test group'
|
|
||||||
members: [
|
|
||||||
GroupMember{
|
|
||||||
user_id: 2
|
|
||||||
role: .admin
|
|
||||||
joined_at: now
|
|
||||||
},
|
|
||||||
GroupMember{
|
|
||||||
user_id: 3
|
|
||||||
role: .writer
|
|
||||||
joined_at: now + 3600
|
|
||||||
},
|
|
||||||
]
|
|
||||||
subgroups: [u32(10)]
|
|
||||||
parent_group: 0
|
|
||||||
is_public: false
|
|
||||||
) or { panic('Failed to create group2: ${err}') }
|
|
||||||
|
|
||||||
// Save both groups
|
|
||||||
group1 = mydb.group.set(group1) or { panic('Failed to save group1: ${err}') }
|
|
||||||
group2 = mydb.group.set(group2) or { panic('Failed to save group2: ${err}') }
|
|
||||||
|
|
||||||
// List groups
|
|
||||||
group_list := mydb.group.list() or { panic('Failed to list groups: ${err}') }
|
|
||||||
|
|
||||||
// Should have 2 more groups than initially
|
|
||||||
assert group_list.len == initial_count + 2
|
|
||||||
|
|
||||||
// Find our groups in the list
|
|
||||||
mut found_group1 := false
|
|
||||||
mut found_group2 := false
|
|
||||||
|
|
||||||
for grp in group_list {
|
|
||||||
if grp.name == 'Group One' {
|
|
||||||
found_group1 = true
|
|
||||||
assert grp.is_public == true
|
|
||||||
assert grp.members.len == 1
|
|
||||||
assert grp.members[0].role == .owner
|
|
||||||
}
|
|
||||||
if grp.name == 'Group Two' {
|
|
||||||
found_group2 = true
|
|
||||||
assert grp.is_public == false
|
|
||||||
assert grp.members.len == 2
|
|
||||||
assert grp.subgroups.len == 1
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
assert found_group1 == true
|
// Create parent group
|
||||||
assert found_group2 == true
|
mut parent_args := GroupArg{
|
||||||
}
|
name: 'parent_group'
|
||||||
|
description: 'Parent group'
|
||||||
fn test_group_edge_cases() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Test group with no members
|
|
||||||
mut empty_group := mydb.group.new(
|
|
||||||
name: 'Empty Group'
|
|
||||||
description: 'Group with no members'
|
|
||||||
members: []GroupMember{}
|
members: []GroupMember{}
|
||||||
subgroups: []u32{}
|
subgroups: []u32{}
|
||||||
parent_group: 0
|
parent_group: 0
|
||||||
is_public: true
|
is_public: true
|
||||||
) or { panic('Failed to create empty group: ${err}') }
|
|
||||||
|
|
||||||
empty_group = mydb.group.set(empty_group) or { panic('Failed to save empty group: ${err}') }
|
|
||||||
|
|
||||||
retrieved_empty := mydb.group.get(empty_group.id) or {
|
|
||||||
panic('Failed to get empty group: ${err}')
|
|
||||||
}
|
|
||||||
assert retrieved_empty.members.len == 0
|
|
||||||
assert retrieved_empty.subgroups.len == 0
|
|
||||||
assert retrieved_empty.is_public == true
|
|
||||||
|
|
||||||
// Test group with many members
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut many_members := []GroupMember{}
|
|
||||||
for i in 0 .. 100 {
|
|
||||||
many_members << GroupMember{
|
|
||||||
user_id: u32(i + 1)
|
|
||||||
role: if i % 4 == 0 {
|
|
||||||
.owner
|
|
||||||
} else if i % 4 == 1 {
|
|
||||||
.admin
|
|
||||||
} else if i % 4 == 2 {
|
|
||||||
.writer
|
|
||||||
} else {
|
|
||||||
.reader
|
|
||||||
}
|
|
||||||
joined_at: now + i64(i * 60) // Different join times
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
mut large_group := mydb.group.new(
|
mut parent_group := db_group.new(parent_args)!
|
||||||
name: 'Large Group'
|
parent_group = db_group.set(parent_group)!
|
||||||
description: 'Group with many members'
|
parent_id := parent_group.id
|
||||||
members: many_members
|
|
||||||
subgroups: []u32{len: 50, init: u32(index + 1000)} // 50 subgroups
|
// Create child group
|
||||||
parent_group: 999
|
mut child_args := GroupArg{
|
||||||
|
name: 'child_group'
|
||||||
|
description: 'Child group'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: []u32{}
|
||||||
|
parent_group: parent_id
|
||||||
is_public: false
|
is_public: false
|
||||||
) or { panic('Failed to create large group: ${err}') }
|
|
||||||
|
|
||||||
large_group = mydb.group.set(large_group) or { panic('Failed to save large group: ${err}') }
|
|
||||||
|
|
||||||
retrieved_large := mydb.group.get(large_group.id) or {
|
|
||||||
panic('Failed to get large group: ${err}')
|
|
||||||
}
|
}
|
||||||
assert retrieved_large.members.len == 100
|
|
||||||
assert retrieved_large.subgroups.len == 50
|
|
||||||
assert retrieved_large.parent_group == 999
|
|
||||||
|
|
||||||
// Verify member roles are preserved
|
mut child_group := db_group.new(child_args)!
|
||||||
mut role_counts := map[GroupRole]int{}
|
child_group = db_group.set(child_group)!
|
||||||
for member in retrieved_large.members {
|
child_id := child_group.id
|
||||||
role_counts[member.role]++
|
|
||||||
|
// Add child to parent's subgroups
|
||||||
|
parent_group.subgroups << child_id
|
||||||
|
parent_group = db_group.set(parent_group)!
|
||||||
|
|
||||||
|
// Verify hierarchy
|
||||||
|
final_parent := db_group.get(parent_id)!
|
||||||
|
final_child := db_group.get(child_id)!
|
||||||
|
|
||||||
|
assert final_parent.subgroups.len == 1
|
||||||
|
assert final_parent.subgroups[0] == child_id
|
||||||
|
assert final_child.parent_group == parent_id
|
||||||
|
|
||||||
|
println('✓ Group hierarchy operations test passed!')
|
||||||
|
}
|
||||||
|
|
||||||
|
fn test_group_list_operations() ! {
|
||||||
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
}
|
}
|
||||||
assert role_counts[GroupRole.owner] == 25
|
|
||||||
assert role_counts[GroupRole.admin] == 25
|
|
||||||
assert role_counts[GroupRole.writer] == 25
|
|
||||||
assert role_counts[GroupRole.reader] == 25
|
|
||||||
|
|
||||||
// Test group with empty strings
|
// Create public group
|
||||||
mut minimal_group := mydb.group.new(
|
mut public_args := GroupArg{
|
||||||
name: ''
|
name: 'public_group'
|
||||||
description: ''
|
description: 'Public group'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: []u32{}
|
||||||
|
parent_group: 0
|
||||||
|
is_public: true
|
||||||
|
}
|
||||||
|
|
||||||
|
mut public_group := db_group.new(public_args)!
|
||||||
|
public_group = db_group.set(public_group)!
|
||||||
|
|
||||||
|
// Create private group
|
||||||
|
mut private_args := GroupArg{
|
||||||
|
name: 'private_group'
|
||||||
|
description: 'Private group'
|
||||||
members: []GroupMember{}
|
members: []GroupMember{}
|
||||||
subgroups: []u32{}
|
subgroups: []u32{}
|
||||||
parent_group: 0
|
parent_group: 0
|
||||||
is_public: false
|
is_public: false
|
||||||
) or { panic('Failed to create minimal group: ${err}') }
|
|
||||||
|
|
||||||
minimal_group = mydb.group.set(minimal_group) or { panic('Failed to save minimal group: ${err}') }
|
|
||||||
|
|
||||||
retrieved_minimal := mydb.group.get(minimal_group.id) or {
|
|
||||||
panic('Failed to get minimal group: ${err}')
|
|
||||||
}
|
}
|
||||||
assert retrieved_minimal.name == ''
|
|
||||||
assert retrieved_minimal.description == ''
|
mut private_group := db_group.new(private_args)!
|
||||||
assert retrieved_minimal.members.len == 0
|
private_group = db_group.set(private_group)!
|
||||||
assert retrieved_minimal.is_public == false
|
|
||||||
|
// Create child group
|
||||||
|
mut child_args := GroupArg{
|
||||||
|
name: 'child_group'
|
||||||
|
description: 'Child group'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: []u32{}
|
||||||
|
parent_group: public_group.id
|
||||||
|
is_public: true
|
||||||
|
}
|
||||||
|
|
||||||
|
mut child_group := db_group.new(child_args)!
|
||||||
|
child_group = db_group.set(child_group)!
|
||||||
|
|
||||||
|
// Add child to parent's subgroups
|
||||||
|
public_group.subgroups << child_group.id
|
||||||
|
public_group = db_group.set(public_group)!
|
||||||
|
|
||||||
|
// Test list with is_public filter
|
||||||
|
public_list_args := GroupListArg{
|
||||||
|
is_public: true
|
||||||
|
parent_group: 0
|
||||||
|
limit: 100
|
||||||
|
}
|
||||||
|
|
||||||
|
public_groups := db_group.list(public_list_args)!
|
||||||
|
assert public_groups.len >= 1
|
||||||
|
|
||||||
|
found_public := false
|
||||||
|
for group in public_groups {
|
||||||
|
if group.id == public_group.id {
|
||||||
|
found_public = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
assert found_public == true
|
||||||
|
|
||||||
|
// Test list with parent_group filter
|
||||||
|
parent_list_args := GroupListArg{
|
||||||
|
is_public: false
|
||||||
|
parent_group: public_group.id
|
||||||
|
limit: 100
|
||||||
|
}
|
||||||
|
|
||||||
|
parent_groups := db_group.list(parent_list_args)!
|
||||||
|
assert parent_groups.len >= 1
|
||||||
|
|
||||||
|
found_child := false
|
||||||
|
for group in parent_groups {
|
||||||
|
if group.id == child_group.id {
|
||||||
|
found_child = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
assert found_child == true
|
||||||
|
|
||||||
|
// Test list with both filters
|
||||||
|
both_list_args := GroupListArg{
|
||||||
|
is_public: true
|
||||||
|
parent_group: public_group.id
|
||||||
|
limit: 100
|
||||||
|
}
|
||||||
|
|
||||||
|
both_groups := db_group.list(both_list_args)!
|
||||||
|
assert both_groups.len >= 1
|
||||||
|
|
||||||
|
found_child_public := false
|
||||||
|
for group in both_groups {
|
||||||
|
if group.id == child_group.id && group.is_public == true {
|
||||||
|
found_child_public = true
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
assert found_child_public == true
|
||||||
|
|
||||||
|
println('✓ Group list operations test passed!')
|
||||||
}
|
}
|
||||||
|
|
||||||
|
fn test_group_type_name() ! {
|
||||||
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a new group
|
||||||
|
mut args := GroupArg{
|
||||||
|
name: 'type_test_group'
|
||||||
|
description: 'Test group for type name'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: []u32{}
|
||||||
|
parent_group: 0
|
||||||
|
is_public: true
|
||||||
|
}
|
||||||
|
|
||||||
|
group := db_group.new(args)!
|
||||||
|
|
||||||
|
// Test type_name method
|
||||||
|
type_name := group.type_name()
|
||||||
|
assert type_name == 'group'
|
||||||
|
|
||||||
|
println('✓ Group type_name test passed!')
|
||||||
|
}
|
||||||
|
|
||||||
|
fn test_group_description() ! {
|
||||||
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a new group
|
||||||
|
mut args := GroupArg{
|
||||||
|
name: 'description_test_group'
|
||||||
|
description: 'Test group for description'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: []u32{}
|
||||||
|
parent_group: 0
|
||||||
|
is_public: true
|
||||||
|
}
|
||||||
|
|
||||||
|
group := db_group.new(args)!
|
||||||
|
|
||||||
|
// Test description method for each methodname
|
||||||
|
assert group.description('set') == 'Create or update a group. Returns the ID of the group.'
|
||||||
|
assert group.description('get') == 'Retrieve a group by ID. Returns the group object.'
|
||||||
|
assert group.description('delete') == 'Delete a group by ID. Returns true if successful.'
|
||||||
|
assert group.description('exist') == 'Check if a group exists by ID. Returns true or false.'
|
||||||
|
assert group.description('list') == 'List all groups. Returns an array of group objects.'
|
||||||
|
assert group.description('unknown') == 'This is generic method for the root object, TODO fill in, ...'
|
||||||
|
|
||||||
|
println('✓ Group description test passed!')
|
||||||
|
}
|
||||||
|
|
||||||
|
fn test_group_example() ! {
|
||||||
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a new group
|
||||||
|
mut args := GroupArg{
|
||||||
|
name: 'example_test_group'
|
||||||
|
description: 'Test group for example'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: []u32{}
|
||||||
|
parent_group: 0
|
||||||
|
is_public: true
|
||||||
|
}
|
||||||
|
|
||||||
|
group := db_group.new(args)!
|
||||||
|
|
||||||
|
// Test example method for each methodname
|
||||||
|
set_call, set_result := group.example('set')
|
||||||
|
assert set_call == '{"group": {"name": "Admins", "description": "Administrators group", "members": [], "subgroups": [], "parent_group": 0, "is_public": false}}'
|
||||||
|
assert set_result == '1'
|
||||||
|
|
||||||
|
get_call, get_result := group.example('get')
|
||||||
|
assert get_call == '{"id": 1}'
|
||||||
|
assert get_result == '{"name": "Admins", "description": "Administrators group", "members": [], "subgroups": [], "parent_group": 0, "is_public": false}'
|
||||||
|
|
||||||
|
delete_call, delete_result := group.example('delete')
|
||||||
|
assert delete_call == '{"id": 1}'
|
||||||
|
assert delete_result == 'true'
|
||||||
|
|
||||||
|
exist_call, exist_result := group.example('exist')
|
||||||
|
assert exist_call == '{"id": 1}'
|
||||||
|
assert exist_result == 'true'
|
||||||
|
|
||||||
|
list_call, list_result := group.example('list')
|
||||||
|
assert list_call == '{}'
|
||||||
|
assert list_result == '[{"name": "Admins", "description": "Administrators group", "members": [], "subgroups": [], "parent_group": 0, "is_public": false}]'
|
||||||
|
|
||||||
|
unknown_call, unknown_result := group.example('unknown')
|
||||||
|
assert unknown_call == '{}'
|
||||||
|
assert unknown_result == '{}'
|
||||||
|
|
||||||
|
println('✓ Group example test passed!')
|
||||||
|
}
|
||||||
|
|
||||||
|
fn test_group_encoding_decoding() ! {
|
||||||
|
// Initialize DBGroup for testing
|
||||||
|
mut mydb := db.new_test()!
|
||||||
|
mut db_group := DBGroup{
|
||||||
|
db: &mydb
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a new group with members
|
||||||
|
mut args := GroupArg{
|
||||||
|
name: 'encoding_test_group'
|
||||||
|
description: 'Test group for encoding/decoding'
|
||||||
|
members: []GroupMember{}
|
||||||
|
subgroups: [u32(10), u32(20), u32(30)]
|
||||||
|
parent_group: 5
|
||||||
|
is_public: true
|
||||||
|
}
|
||||||
|
|
||||||
|
mut group := db_group.new(args)!
|
||||||
|
|
||||||
|
// Add some members
|
||||||
|
group.add_member(100, .admin)
|
||||||
|
group.add_member(101, .writer)
|
||||||
|
|
||||||
|
// Save the group
|
||||||
|
group = db_group.set(group)!
|
||||||
|
group_id := group.id
|
||||||
|
|
||||||
|
// Retrieve and verify all fields were properly encoded/decoded
|
||||||
|
retrieved_group := db_group.get(group_id)!
|
||||||
|
|
||||||
|
assert retrieved_group.name == 'encoding_test_group'
|
||||||
|
assert retrieved_group.description == 'Test group for encoding/decoding'
|
||||||
|
assert retrieved_group.subgroups == [u32(10), u32(20), u32(30)]
|
||||||
|
assert retrieved_group.parent_group == 5
|
||||||
|
assert retrieved_group.is_public == true
|
||||||
|
assert retrieved_group.members.len == 2
|
||||||
|
|
||||||
|
// Verify member details
|
||||||
|
assert retrieved_group.members[0].user_id == 100
|
||||||
|
assert retrieved_group.members[0].role == .admin
|
||||||
|
assert retrieved_group.members[1].user_id == 101
|
||||||
|
assert retrieved_group.members[1].role == .writer
|
||||||
|
|
||||||
|
println('✓ Group encoding/decoding test passed!')
|
||||||
|
}
|
||||||
@@ -48,6 +48,13 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }

+@[params]
+pub struct ProjectListArg {
+pub mut:
+	status ProjectStatus
+	limit  int = 100 // Default limit is 100
+}
+
 pub fn (self Project) type_name() string {
 	return 'project'
 }
@@ -100,7 +107,7 @@ pub fn (self Project) example(methodname string) (string, string) {
 	}
 }

-pub fn (self Project) dump(mut e encoder.Encoder) ! {
+fn (self Project) dump(mut e encoder.Encoder) ! {
 	e.add_u16(u16(self.swimlanes.len))
 	for swimlane in self.swimlanes {
 		e.add_string(swimlane.name)
@@ -233,6 +240,34 @@ pub fn (mut self DBProject) get(id u32) !Project {
 	return o
 }

-pub fn (mut self DBProject) list() ![]Project {
-	return self.db.list[Project]()!.map(self.get(it)!)
+pub fn (mut self DBProject) list(args ProjectListArg) ![]Project {
+	// Require at least one parameter to be provided
+	if args.status == .planning {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all projects from the database
+	all_projects := self.db.list[Project]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_projects := []Project{}
+	for project in all_projects {
+		// Filter by status if provided (status is not planning)
+		if args.status != .planning && project.status != args.status {
+			continue
+		}
+
+		filtered_projects << project
+	}
+
+	// Limit results to 100 or the specified limit
+	limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_projects.len > limit {
+		return filtered_projects[..limit]
+	}
+
+	return filtered_projects
 }
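ProjectListArg.status defaults to .planning, so that variant doubles as the "no filter" sentinel: leaving it at the default makes list return an error, and projects can never be filtered for the planning status itself. A usage sketch; db_project and the .completed variant name are assumptions for illustration, only .planning appears in this diff:

// db_project is assumed to be a DBProject wired to a db.DB handle,
// mirroring the DBGroup setup shown earlier in this commit.
completed := db_project.list(status: .completed, limit: 25)!
println('found ${completed.len} completed projects')

Note that this hunk declares limit := args.limit without mut and then assigns limit = 100 to it, unlike the mut limit in DBGroup.list, so the clamping branch will not compile as written.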
@@ -58,6 +58,17 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }

+@[params]
+pub struct ProjectIssueListArg {
+pub mut:
+	project_id u32
+	issue_type IssueType
+	status     IssueStatus
+	swimlane   string
+	milestone  string
+	limit      int = 100 // Default limit is 100
+}
+
 pub fn (self ProjectIssue) type_name() string {
 	return 'project_issue'
 }
@@ -110,7 +121,7 @@ pub fn (self ProjectIssue) example(methodname string) (string, string) {
 	}
 }

-pub fn (self ProjectIssue) dump(mut e encoder.Encoder) ! {
+fn (self ProjectIssue) dump(mut e encoder.Encoder) ! {
 	e.add_string(self.title)
 	e.add_u32(self.project_id)
 	e.add_u8(u8(self.issue_type))
@@ -255,6 +266,54 @@ pub fn (mut self DBProjectIssue) get(id u32) !ProjectIssue {
 	return o
 }

-pub fn (mut self DBProjectIssue) list() ![]ProjectIssue {
-	return self.db.list[ProjectIssue]()!.map(self.get(it)!)
+pub fn (mut self DBProjectIssue) list(args ProjectIssueListArg) ![]ProjectIssue {
+	// Require at least one parameter to be provided
+	if args.project_id == 0 && args.issue_type == .task && args.status == .open && args.swimlane == '' && args.milestone == '' {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all project issues from the database
+	all_project_issues := self.db.list[ProjectIssue]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_project_issues := []ProjectIssue{}
+	for project_issue in all_project_issues {
+		// Filter by project_id if provided
+		if args.project_id != 0 && project_issue.project_id != args.project_id {
+			continue
+		}
+
+		// Filter by issue_type if provided (issue_type is not task)
+		if args.issue_type != .task && project_issue.issue_type != args.issue_type {
+			continue
+		}
+
+		// Filter by status if provided (status is not open)
+		if args.status != .open && project_issue.status != args.status {
+			continue
+		}
+
+		// Filter by swimlane if provided
+		if args.swimlane != '' && project_issue.swimlane != args.swimlane {
+			continue
+		}
+
+		// Filter by milestone if provided
+		if args.milestone != '' && project_issue.milestone != args.milestone {
+			continue
+		}
+
+		filtered_project_issues << project_issue
+	}
+
+	// Limit results to 100 or the specified limit
+	limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_project_issues.len > limit {
+		return filtered_project_issues[..limit]
+	}
+
+	return filtered_project_issues
 }
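DBProjectIssue.list follows the same pattern: at least one filter must differ from its default (project_id 0, .task, .open, empty swimlane/milestone), everything is fetched eagerly, then filtered and truncated in memory. A usage sketch; the db_issue handle is assumed, and 'development' is a swimlane name taken from the tests in this commit:

// db_issue is assumed to be a DBProjectIssue carrying the same &db.DB handle.
dev_issues := db_issue.list(
	project_id: 42
	swimlane:   'development'
)!
for issue in dev_issues {
	println('${issue.name}: ${issue.title} [${issue.status}]')
}

As with projects, the defaults .task and .open are read as "not provided", so issues cannot be filtered for exactly those two values, and the same non-mut limit assignment appears here as well.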
@@ -1,730 +0,0 @@
-#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test
-
-module heromodels
-
-import freeflowuniverse.herolib.hero.heromodels { IssuePriority, IssueType, Milestone, Swimlane }
-import freeflowuniverse.herolib.data.ourtime
-
-// Test ProjectIssue model CRUD operations
-fn test_project_issue_new() {
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a project first with default swimlanes
|
|
||||||
mut project := mydb.project.new(
|
|
||||||
name: 'Test Project'
|
|
||||||
description: 'Test project for issues'
|
|
||||||
start_date: ourtime.now().str()
|
|
||||||
end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 90)).str()
|
|
||||||
swimlanes: [
|
|
||||||
Swimlane{
|
|
||||||
name: 'backlog'
|
|
||||||
description: 'Backlog items'
|
|
||||||
order: 1
|
|
||||||
color: '#cccccc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'todo'
|
|
||||||
description: 'To do items'
|
|
||||||
order: 2
|
|
||||||
color: '#ffcccc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'development'
|
|
||||||
description: 'In development'
|
|
||||||
order: 3
|
|
||||||
color: '#ccffcc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'test'
|
|
||||||
description: 'Testing'
|
|
||||||
order: 4
|
|
||||||
color: '#ccccff'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'completed'
|
|
||||||
description: 'Completed'
|
|
||||||
order: 5
|
|
||||||
color: '#ccffff'
|
|
||||||
is_done: true
|
|
||||||
},
|
|
||||||
]
|
|
||||||
milestones: [
|
|
||||||
Milestone{
|
|
||||||
name: 'v1_release'
|
|
||||||
description: 'Version 1.0 release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 30
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
Milestone{
|
|
||||||
name: 'hotfix_release'
|
|
||||||
description: 'Hotfix release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 7
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
Milestone{
|
|
||||||
name: 'v2_release'
|
|
||||||
description: 'Version 2.0 release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 90
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
Milestone{
|
|
||||||
name: 'original_milestone'
|
|
||||||
description: 'Original milestone'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 60
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
]
|
|
||||||
) or { panic('Failed to create project: ${err}') }
|
|
||||||
project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
|
|
||||||
project_id := project.id
|
|
||||||
|
|
||||||
// Test creating a new project issue with all fields
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut issue := mydb.project_issue.new(
|
|
||||||
name: 'PROJ-123'
|
|
||||||
description: 'Implement user authentication system'
|
|
||||||
title: 'User Authentication Feature'
|
|
||||||
project_id: project_id
|
|
||||||
issue_type: .story
|
|
||||||
priority: .high
|
|
||||||
status: .open
|
|
||||||
swimlane: 'backlog'
|
|
||||||
assignees: [u32(10), 20, 30]
|
|
||||||
reporter: 5
|
|
||||||
milestone: 'v1_release'
|
|
||||||
deadline: ourtime.new_from_epoch(u64(now + 86400 * 14)).str()
|
|
||||||
estimate: 8 // 8 story points
|
|
||||||
fs_files: [u32(100), 200]
|
|
||||||
parent_id: 0
|
|
||||||
children: []u32{}
|
|
||||||
securitypolicy: 1
|
|
||||||
tags: ['authentication', 'security', 'backend']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create project issue: ${err}') }
|
|
||||||
|
|
||||||
// Verify the issue was created with correct values
|
|
||||||
assert issue.name == 'PROJ-123'
|
|
||||||
assert issue.description == 'Implement user authentication system'
|
|
||||||
assert issue.title == 'User Authentication Feature'
|
|
||||||
assert issue.project_id == project_id
|
|
||||||
assert issue.issue_type == .story
|
|
||||||
assert issue.priority == .high
|
|
||||||
assert issue.status == .open
|
|
||||||
assert issue.swimlane == 'backlog'
|
|
||||||
assert issue.assignees.len == 3
|
|
||||||
assert issue.assignees[0] == 10
|
|
||||||
assert issue.reporter == 5
|
|
||||||
assert issue.milestone == 'v1_release'
|
|
||||||
// Allow for small timing differences (within 60 seconds)
|
|
||||||
expected_deadline := now + 86400 * 14
|
|
||||||
assert issue.deadline >= expected_deadline - 60 && issue.deadline <= expected_deadline + 60
|
|
||||||
assert issue.estimate == 8
|
|
||||||
assert issue.fs_files.len == 2
|
|
||||||
assert issue.parent_id == 0
|
|
||||||
assert issue.children.len == 0
|
|
||||||
assert issue.id == 0 // Should be 0 before saving
|
|
||||||
assert issue.updated_at > 0 // Should have timestamp
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_project_issue_set_and_get() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a project first with default swimlanes
|
|
||||||
mut project := mydb.project.new(
|
|
||||||
name: 'Test Project'
|
|
||||||
description: 'Test project for issues'
|
|
||||||
start_date: ourtime.now().str()
|
|
||||||
end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 90)).str()
|
|
||||||
swimlanes: [
|
|
||||||
Swimlane{
|
|
||||||
name: 'backlog'
|
|
||||||
description: 'Backlog items'
|
|
||||||
order: 1
|
|
||||||
color: '#cccccc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'todo'
|
|
||||||
description: 'To do items'
|
|
||||||
order: 2
|
|
||||||
color: '#ffcccc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'development'
|
|
||||||
description: 'In development'
|
|
||||||
order: 3
|
|
||||||
color: '#ccffcc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'test'
|
|
||||||
description: 'Testing'
|
|
||||||
order: 4
|
|
||||||
color: '#ccccff'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'completed'
|
|
||||||
description: 'Completed'
|
|
||||||
order: 5
|
|
||||||
color: '#ccffff'
|
|
||||||
is_done: true
|
|
||||||
},
|
|
||||||
]
|
|
||||||
milestones: [
|
|
||||||
Milestone{
|
|
||||||
name: 'v1_release'
|
|
||||||
description: 'Version 1.0 release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 30
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
Milestone{
|
|
||||||
name: 'hotfix_release'
|
|
||||||
description: 'Hotfix release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 7
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
Milestone{
|
|
||||||
name: 'v2_release'
|
|
||||||
description: 'Version 2.0 release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 90
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
Milestone{
|
|
||||||
name: 'original_milestone'
|
|
||||||
description: 'Original milestone'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 60
|
|
||||||
completed: false
|
|
||||||
issues: []
|
|
||||||
},
|
|
||||||
]
|
|
||||||
) or { panic('Failed to create project: ${err}') }
|
|
||||||
project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
|
|
||||||
project_id := project.id
|
|
||||||
|
|
||||||
// Create a project issue
|
|
||||||
now := ourtime.now().unix()
|
|
||||||
mut issue := mydb.project_issue.new(
|
|
||||||
name: 'BUG-456'
|
|
||||||
description: 'Fix login page CSS styling issues'
|
|
||||||
title: 'Login Page Styling Bug'
|
|
||||||
project_id: project_id
|
|
||||||
issue_type: .bug
|
|
||||||
priority: .medium
|
|
||||||
status: .in_progress
|
|
||||||
swimlane: 'development'
|
|
||||||
assignees: [u32(15)]
|
|
||||||
reporter: 8
|
|
||||||
milestone: 'hotfix_release'
|
|
||||||
deadline: ourtime.new_from_epoch(u64(now + 86400 * 3)).str() // 3 days from now
|
|
||||||
estimate: 2 // 2 story points
|
|
||||||
fs_files: [u32(300)]
|
|
||||||
parent_id: 0
|
|
||||||
children: []u32{}
|
|
||||||
securitypolicy: 2
|
|
||||||
tags: ['bug', 'frontend', 'css']
|
|
||||||
comments: []
|
|
||||||
) or { panic('Failed to create project issue: ${err}') }
|
|
||||||
|
|
||||||
// Save the issue
|
|
||||||
issue = mydb.project_issue.set(issue) or { panic('Failed to save project issue: ${err}') }
|
|
||||||
|
|
||||||
// Verify ID was assigned
|
|
||||||
assert issue.id > 0
|
|
||||||
original_id := issue.id
|
|
||||||
|
|
||||||
// Retrieve the issue
|
|
||||||
retrieved_issue := mydb.project_issue.get(issue.id) or {
|
|
||||||
panic('Failed to get project issue: ${err}')
|
|
||||||
}
|
|
||||||
|
|
||||||
// Verify all fields match
|
|
||||||
assert retrieved_issue.id == original_id
|
|
||||||
assert retrieved_issue.name == 'BUG-456'
|
|
||||||
assert retrieved_issue.description == 'Fix login page CSS styling issues'
|
|
||||||
assert retrieved_issue.title == 'Login Page Styling Bug'
|
|
||||||
assert retrieved_issue.project_id == project_id
|
|
||||||
assert retrieved_issue.issue_type == .bug
|
|
||||||
assert retrieved_issue.priority == .medium
|
|
||||||
assert retrieved_issue.status == .in_progress
|
|
||||||
assert retrieved_issue.swimlane == 'development'
|
|
||||||
assert retrieved_issue.assignees.len == 1
|
|
||||||
assert retrieved_issue.assignees[0] == 15
|
|
||||||
assert retrieved_issue.reporter == 8
|
|
||||||
assert retrieved_issue.milestone == 'hotfix_release'
|
|
||||||
// Allow for small timing differences (within 60 seconds)
|
|
||||||
expected_deadline_2 := now + 86400 * 3
|
|
||||||
assert retrieved_issue.deadline >= expected_deadline_2 - 60
|
|
||||||
&& retrieved_issue.deadline <= expected_deadline_2 + 60
|
|
||||||
assert retrieved_issue.estimate == 2
|
|
||||||
assert retrieved_issue.fs_files.len == 1
|
|
||||||
assert retrieved_issue.fs_files[0] == 300
|
|
||||||
assert retrieved_issue.parent_id == 0
|
|
||||||
assert retrieved_issue.children.len == 0
|
|
||||||
assert retrieved_issue.created_at > 0
|
|
||||||
assert retrieved_issue.updated_at > 0
|
|
||||||
}
|
|
||||||
|
|
||||||
fn test_project_issue_types_and_priorities() {
|
|
||||||
mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }
|
|
||||||
|
|
||||||
// Create a project first with default swimlanes
|
|
||||||
mut project := mydb.project.new(
|
|
||||||
name: 'Test Project'
|
|
||||||
description: 'Test project for issues'
|
|
||||||
start_date: ourtime.now().str()
|
|
||||||
end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 90)).str()
|
|
||||||
swimlanes: [
|
|
||||||
Swimlane{
|
|
||||||
name: 'backlog'
|
|
||||||
description: 'Backlog items'
|
|
||||||
order: 1
|
|
||||||
color: '#cccccc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'todo'
|
|
||||||
description: 'To do items'
|
|
||||||
order: 2
|
|
||||||
color: '#ffcccc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'development'
|
|
||||||
description: 'In development'
|
|
||||||
order: 3
|
|
||||||
color: '#ccffcc'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'test'
|
|
||||||
description: 'Testing'
|
|
||||||
order: 4
|
|
||||||
color: '#ccccff'
|
|
||||||
is_done: false
|
|
||||||
},
|
|
||||||
Swimlane{
|
|
||||||
name: 'completed'
|
|
||||||
description: 'Completed'
|
|
||||||
order: 5
|
|
||||||
color: '#ccffff'
|
|
||||||
is_done: true
|
|
||||||
},
|
|
||||||
]
|
|
||||||
milestones: [
|
|
||||||
Milestone{
|
|
||||||
name: 'v1_release'
|
|
||||||
description: 'Version 1.0 release'
|
|
||||||
due_date: ourtime.now().unix() + 86400 * 30
|
|
				completed: false
				issues: []
			},
			Milestone{
				name: 'hotfix_release'
				description: 'Hotfix release'
				due_date: ourtime.now().unix() + 86400 * 7
				completed: false
				issues: []
			},
			Milestone{
				name: 'v2_release'
				description: 'Version 2.0 release'
				due_date: ourtime.now().unix() + 86400 * 90
				completed: false
				issues: []
			},
			Milestone{
				name: 'original_milestone'
				description: 'Original milestone'
				due_date: ourtime.now().unix() + 86400 * 60
				completed: false
				issues: []
			},
		]
	) or { panic('Failed to create project: ${err}') }
	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
	project_id := project.id

	// Test all issue types
	issue_types := [IssueType.task, .story, .bug, .question, .epic, .subtask]
	priorities := [IssuePriority.lowest, .low, .medium, .high, .highest, .critical]

	for i, issue_type in issue_types {
		priority := priorities[i % priorities.len]

		mut issue := mydb.project_issue.new(
			name: 'TEST-${i}'
			description: 'Testing ${issue_type} with ${priority} priority'
			title: 'Test Issue ${i}'
			project_id: project_id
			issue_type: IssueType(issue_type)
			priority: IssuePriority(priority)
			status: .open
			swimlane: 'development'
			assignees: []u32{}
			reporter: 1
			milestone: 'v1_release'
			deadline: ''
			estimate: 1
			fs_files: []u32{}
			parent_id: 0
			children: []u32{}
			securitypolicy: 1
			tags: ['test']
			comments: []
		) or { panic('Failed to create issue with type ${issue_type}: ${err}') }

		issue = mydb.project_issue.set(issue) or {
			panic('Failed to save issue with type ${issue_type}: ${err}')
		}

		retrieved_issue := mydb.project_issue.get(issue.id) or {
			panic('Failed to get issue with type ${issue_type}: ${err}')
		}
		assert retrieved_issue.issue_type == IssueType(issue_type)
		assert retrieved_issue.priority == IssuePriority(priority)
	}
}

fn test_project_issue_parent_child_relationship() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create a project first with default swimlanes
	mut project := mydb.project.new(
		name: 'Test Project'
		description: 'Test project for issues'
		start_date: ourtime.now().str()
		end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 90)).str()
		swimlanes: [
			Swimlane{
				name: 'backlog'
				description: 'Backlog items'
				order: 1
				color: '#cccccc'
				is_done: false
			},
			Swimlane{
				name: 'todo'
				description: 'To do items'
				order: 2
				color: '#ffcccc'
				is_done: false
			},
			Swimlane{
				name: 'development'
				description: 'In development'
				order: 3
				color: '#ccffcc'
				is_done: false
			},
			Swimlane{
				name: 'test'
				description: 'Testing'
				order: 4
				color: '#ccccff'
				is_done: false
			},
			Swimlane{
				name: 'completed'
				description: 'Completed'
				order: 5
				color: '#ccffff'
				is_done: true
			},
		]
		milestones: [
			Milestone{
				name: 'v1_release'
				description: 'Version 1.0 release'
				due_date: ourtime.now().unix() + 86400 * 30
				completed: false
				issues: []
			},
			Milestone{
				name: 'hotfix_release'
				description: 'Hotfix release'
				due_date: ourtime.now().unix() + 86400 * 7
				completed: false
				issues: []
			},
			Milestone{
				name: 'v2_release'
				description: 'Version 2.0 release'
				due_date: ourtime.now().unix() + 86400 * 90
				completed: false
				issues: []
			},
			Milestone{
				name: 'original_milestone'
				description: 'Original milestone'
				due_date: ourtime.now().unix() + 86400 * 60
				completed: false
				issues: []
			},
		]
	) or { panic('Failed to create project: ${err}') }
	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
	project_id := project.id

	// Create parent epic
	mut parent_epic := mydb.project_issue.new(
		name: 'EPIC-001'
		description: 'User management epic'
		title: 'User Management System'
		project_id: project_id
		issue_type: .epic
		priority: .high
		status: .in_progress
		swimlane: 'development'
		assignees: [u32(1)]
		reporter: 1
		milestone: 'v2_release'
		deadline: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 60)).str()
		estimate: 50
		fs_files: []u32{}
		parent_id: 0
		children: []u32{}
		securitypolicy: 1
		tags: ['epic', 'user-management']
		comments: []
	) or { panic('Failed to create parent epic: ${err}') }

	parent_epic = mydb.project_issue.set(parent_epic) or { panic('Failed to save parent epic: ${err}') }
	parent_id := parent_epic.id

	// Create child subtasks
	mut subtask1 := mydb.project_issue.new(
		name: 'TASK-001'
		description: 'Create user registration form'
		title: 'User Registration Form'
		project_id: project_id
		issue_type: .subtask
		priority: .medium
		status: .open
		swimlane: 'development'
		assignees: [u32(2)]
		reporter: 1
		milestone: 'v2_release'
		deadline: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 20)).str()
		estimate: 5
		fs_files: []u32{}
		parent_id: parent_id
		children: []u32{}
		securitypolicy: 1
		tags: ['subtask', 'frontend']
		comments: []
	) or { panic('Failed to create subtask1: ${err}') }

	mut subtask2 := mydb.project_issue.new(
		name: 'TASK-002'
		description: 'Implement user authentication API'
		title: 'User Authentication API'
		project_id: project_id
		issue_type: .subtask
		priority: .high
		status: .open
		swimlane: 'development'
		assignees: [u32(3)]
		reporter: 1
		milestone: 'v2_release'
		deadline: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 25)).str()
		estimate: 8
		fs_files: []u32{}
		parent_id: parent_id
		children: []u32{}
		securitypolicy: 1
		tags: ['subtask', 'backend', 'api']
		comments: []
	) or { panic('Failed to create subtask2: ${err}') }

	subtask1 = mydb.project_issue.set(subtask1) or { panic('Failed to save subtask1: ${err}') }
	subtask2 = mydb.project_issue.set(subtask2) or { panic('Failed to save subtask2: ${err}') }

	// Update parent with children
	parent_epic.children = [subtask1.id, subtask2.id]
	parent_epic = mydb.project_issue.set(parent_epic) or { panic('Failed to update parent epic: ${err}') }

	// Verify relationships
	retrieved_parent := mydb.project_issue.get(parent_id) or {
		panic('Failed to get parent epic: ${err}')
	}
	retrieved_subtask1 := mydb.project_issue.get(subtask1.id) or {
		panic('Failed to get subtask1: ${err}')
	}
	retrieved_subtask2 := mydb.project_issue.get(subtask2.id) or {
		panic('Failed to get subtask2: ${err}')
	}

	assert retrieved_parent.children.len == 2
	assert retrieved_parent.children.contains(subtask1.id)
	assert retrieved_parent.children.contains(subtask2.id)
	assert retrieved_subtask1.parent_id == parent_id
	assert retrieved_subtask2.parent_id == parent_id
	assert retrieved_subtask1.issue_type == .subtask
	assert retrieved_subtask2.issue_type == .subtask
}

fn test_project_issue_update() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create a project first with default swimlanes
	mut project := mydb.project.new(
		name: 'Test Project'
		description: 'Test project for issues'
		start_date: ourtime.now().str()
		end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 90)).str()
		swimlanes: [
			Swimlane{
				name: 'backlog'
				description: 'Backlog items'
				order: 1
				color: '#cccccc'
				is_done: false
			},
			Swimlane{
				name: 'todo'
				description: 'To do items'
				order: 2
				color: '#ffcccc'
				is_done: false
			},
			Swimlane{
				name: 'development'
				description: 'In development'
				order: 3
				color: '#ccffcc'
				is_done: false
			},
			Swimlane{
				name: 'test'
				description: 'Testing'
				order: 4
				color: '#ccccff'
				is_done: false
			},
			Swimlane{
				name: 'completed'
				description: 'Completed'
				order: 5
				color: '#ccffff'
				is_done: true
			},
		]
		milestones: [
			Milestone{
				name: 'v1_release'
				description: 'Version 1.0 release'
				due_date: ourtime.now().unix() + 86400 * 30
				completed: false
				issues: []
			},
			Milestone{
				name: 'hotfix_release'
				description: 'Hotfix release'
				due_date: ourtime.now().unix() + 86400 * 7
				completed: false
				issues: []
			},
			Milestone{
				name: 'v2_release'
				description: 'Version 2.0 release'
				due_date: ourtime.now().unix() + 86400 * 90
				completed: false
				issues: []
			},
			Milestone{
				name: 'original_milestone'
				description: 'Original milestone'
				due_date: ourtime.now().unix() + 86400 * 60
				completed: false
				issues: []
			},
		]
	) or { panic('Failed to create project: ${err}') }
	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
	project_id := project.id

	// Create and save an issue
	now := ourtime.now().unix()
	mut issue := mydb.project_issue.new(
		name: 'STORY-100'
		description: 'Original description'
		title: 'Original Title'
		project_id: project_id
		issue_type: .story
		priority: .low
		status: .open
		swimlane: 'development'
		assignees: [u32(1)]
		reporter: 2
		milestone: 'original_milestone'
		deadline: ourtime.new_from_epoch(u64(now + 86400 * 30)).str()
		estimate: 3
		fs_files: []u32{}
		parent_id: 0
		children: []u32{}
		securitypolicy: 1
		tags: ['original']
		comments: []
	) or { panic('Failed to create project issue: ${err}') }

	issue = mydb.project_issue.set(issue) or { panic('Failed to save project issue: ${err}') }
	original_id := issue.id
	original_created_at := issue.created_at
	original_updated_at := issue.updated_at

	// Update the issue
	issue.name = 'STORY-100-UPDATED'
	issue.description = 'Updated description'
	issue.title = 'Updated Title'
	issue.project_id = 2
	issue.issue_type = .task
	issue.priority = .critical
	issue.status = .done
	issue.swimlane = 'completed'
	issue.assignees = [u32(5), 6, 7]
	issue.reporter = 8
	issue.milestone = 'updated_milestone'
	issue.deadline = now + 86400 * 7
	issue.estimate = 13
	issue.fs_files = [u32(400), 500]

	issue = mydb.project_issue.set(issue) or { panic('Failed to update project issue: ${err}') }

	// Verify ID remains the same and updated_at is set
	assert issue.id == original_id
	assert issue.created_at == original_created_at
	assert issue.updated_at >= original_updated_at

	// Retrieve and verify updates
	updated_issue := mydb.project_issue.get(issue.id) or {
		panic('Failed to get updated project issue: ${err}')
	}
	assert updated_issue.name == 'STORY-100-UPDATED'
	assert updated_issue.description == 'Updated description'
	assert updated_issue.title == 'Updated Title'
	assert updated_issue.project_id == 2
	assert updated_issue.issue_type == .task
	assert updated_issue.priority == .critical
	assert updated_issue.status == .done
	assert updated_issue.swimlane == 'completed'
	assert updated_issue.assignees.len == 3
	assert updated_issue.assignees[0] == 5
	assert updated_issue.reporter == 8
	assert updated_issue.milestone == 'updated_milestone'
	assert updated_issue.deadline == now + 86400 * 7
	assert updated_issue.estimate == 13
	assert updated_issue.fs_files.len == 2
}
@@ -1,617 +0,0 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module heromodels

import freeflowuniverse.herolib.hero.heromodels
import freeflowuniverse.herolib.data.ourtime

// Test Project model CRUD operations
fn test_project_new() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test creating a new project with all fields
	now := ourtime.now().unix()
	start_date := ourtime.new_from_epoch(u64(u64(now))).str()
	end_date := ourtime.new_from_epoch(u64(u64(now + 86400 * 90))).str()
	mut project := mydb.project.new(
		name: 'Web Application Redesign'
		description: 'Complete redesign of the company website'
		swimlanes: [
			Swimlane{
				name: 'backlog'
				description: 'Items waiting to be started'
				order: 1
				color: '#CCCCCC'
				is_done: false
			},
			Swimlane{
				name: 'in_progress'
				description: 'Currently being worked on'
				order: 2
				color: '#FFFF00'
				is_done: false
			},
			Swimlane{
				name: 'done'
				description: 'Completed items'
				order: 3
				color: '#00FF00'
				is_done: true
			},
		]
		milestones: [
			Milestone{
				name: 'design_complete'
				description: 'All design mockups completed'
				due_date: now + 86400 * 30 // 30 days from now
				completed: false
				issues: [u32(1), 2, 3]
			},
			Milestone{
				name: 'development_complete'
				description: 'All development work finished'
				due_date: now + 86400 * 60 // 60 days from now
				completed: false
				issues: [u32(4), 5, 6]
			},
		]
		issues: ['issue-1', 'issue-2', 'issue-3']
		fs_files: [u32(100), 200, 300]
		status: .planning
		start_date: start_date
		end_date: end_date
		securitypolicy: 1
		tags: ['web', 'redesign', 'frontend']
		comments: []
	) or { panic('Failed to create project: ${err}') }

	// Verify the project was created with correct values
	assert project.name == 'Web Application Redesign'
	assert project.description == 'Complete redesign of the company website'
	assert project.swimlanes.len == 3
	assert project.swimlanes[0].name == 'backlog'
	assert project.swimlanes[1].name == 'in_progress'
	assert project.swimlanes[2].name == 'done'
	assert project.swimlanes[2].is_done == true
	assert project.milestones.len == 2
	assert project.milestones[0].name == 'design_complete'
	assert project.milestones[0].issues.len == 3
	assert project.issues.len == 3
	assert project.issues[0] == 'issue-1'
	assert project.fs_files.len == 3
	assert project.status == .planning
	assert project.start_date > 0 // Should have valid timestamp
	assert project.end_date > project.start_date // End should be after start
	assert project.id == 0 // Should be 0 before saving
	assert project.updated_at > 0 // Should have timestamp
}

fn test_project_set_and_get() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create a project
	now := ourtime.now().unix()
	start_date := ourtime.new_from_epoch(u64(u64(now - 86400 * 7))).str()
	end_date := ourtime.new_from_epoch(u64(u64(now + 86400 * 120))).str()
	mut project := mydb.project.new(
		name: 'Mobile App Development'
		description: 'Native mobile application for iOS and Android'
		swimlanes: [
			Swimlane{
				name: 'todo'
				description: 'Tasks to be done'
				order: 1
				color: '#FF0000'
				is_done: false
			},
			Swimlane{
				name: 'testing'
				description: 'Items being tested'
				order: 2
				color: '#0000FF'
				is_done: false
			},
		]
		milestones: [
			Milestone{
				name: 'mvp_release'
				description: 'Minimum viable product release'
				due_date: now + 86400 * 45
				completed: false
				issues: [u32(10), 11, 12]
			},
		]
		issues: ['mobile-1', 'mobile-2']
		fs_files: [u32(500), 600]
		status: .active
		start_date: start_date
		end_date: end_date
		securitypolicy: 2
		tags: ['mobile', 'ios', 'android']
		comments: []
	) or { panic('Failed to create project: ${err}') }

	// Save the project
	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }

	// Verify ID was assigned
	assert project.id > 0
	original_id := project.id

	// Retrieve the project
	retrieved_project := mydb.project.get(project.id) or { panic('Failed to get project: ${err}') }

	// Verify all fields match
	assert retrieved_project.id == original_id
	assert retrieved_project.name == 'Mobile App Development'
	assert retrieved_project.description == 'Native mobile application for iOS and Android'
	assert retrieved_project.swimlanes.len == 2
	assert retrieved_project.swimlanes[0].name == 'todo'
	assert retrieved_project.swimlanes[0].color == '#FF0000'
	assert retrieved_project.swimlanes[1].name == 'testing'
	assert retrieved_project.swimlanes[1].color == '#0000FF'
	assert retrieved_project.milestones.len == 1
	assert retrieved_project.milestones[0].name == 'mvp_release'
	assert retrieved_project.milestones[0].issues.len == 3
	assert retrieved_project.issues.len == 2
	assert retrieved_project.issues[0] == 'mobile-1'
	assert retrieved_project.fs_files.len == 2
	assert retrieved_project.status == .active
	assert retrieved_project.start_date > 0 // Should have valid timestamp
	assert retrieved_project.end_date > retrieved_project.start_date // End should be after start
	assert retrieved_project.created_at > 0
	assert retrieved_project.updated_at > 0
}

fn test_project_status_transitions() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test all project status values
	statuses := [heromodels.ProjectStatus.planning, .active, .on_hold, .completed, .cancelled]
	now := ourtime.now().unix()
	start_date := ourtime.new_from_epoch(u64(u64(now))).str()
	end_date := ourtime.new_from_epoch(u64(u64(now + 86400 * 30))).str()

	for status in statuses {
		mut project := mydb.project.new(
			name: 'Project ${status}'
			description: 'Testing status ${status}'
			swimlanes: [
				Swimlane{
					name: 'test'
					description: 'Test swimlane'
					order: 1
					color: '#000000'
					is_done: false
				},
			]
			milestones: []Milestone{}
			issues: []string{}
			fs_files: []u32{}
			status: heromodels.ProjectStatus(status)
			start_date: start_date
			end_date: end_date
			securitypolicy: 1
			tags: ['test']
			comments: []
		) or { panic('Failed to create project with status ${status}: ${err}') }

		project = mydb.project.set(project) or {
			panic('Failed to save project with status ${status}: ${err}')
		}

		retrieved_project := mydb.project.get(project.id) or {
			panic('Failed to get project with status ${status}: ${err}')
		}
		assert retrieved_project.status == heromodels.ProjectStatus(status)
	}
}

fn test_project_swimlanes() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create project with complex swimlanes
	mut project := mydb.project.new(
		name: 'Swimlane Test Project'
		description: 'Testing swimlane functionality'
		swimlanes: [
			Swimlane{
				name: 'backlog'
				description: 'Product backlog items'
				order: 1
				color: '#EEEEEE'
				is_done: false
			},
			Swimlane{
				name: 'sprint_planning'
				description: 'Items being planned for sprint'
				order: 2
				color: '#FFCC00'
				is_done: false
			},
			Swimlane{
				name: 'in_development'
				description: 'Currently being developed'
				order: 3
				color: '#FF6600'
				is_done: false
			},
			Swimlane{
				name: 'code_review'
				description: 'Under code review'
				order: 4
				color: '#3366FF'
				is_done: false
			},
			Swimlane{
				name: 'testing'
				description: 'Being tested'
				order: 5
				color: '#9933FF'
				is_done: false
			},
			Swimlane{
				name: 'done'
				description: 'Completed and deployed'
				order: 6
				color: '#00CC00'
				is_done: true
			},
		]
		milestones: []Milestone{}
		issues: []string{}
		fs_files: []u32{}
		status: .active
		start_date: ourtime.now().str()
		end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 60)).str()
		securitypolicy: 1
		tags: ['agile', 'scrum']
		comments: []
	) or { panic('Failed to create project: ${err}') }

	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }

	retrieved_project := mydb.project.get(project.id) or { panic('Failed to get project: ${err}') }

	// Verify all swimlanes are preserved
	assert retrieved_project.swimlanes.len == 6

	// Check specific swimlanes
	assert retrieved_project.swimlanes[0].name == 'backlog'
	assert retrieved_project.swimlanes[0].order == 1
	assert retrieved_project.swimlanes[0].is_done == false

	assert retrieved_project.swimlanes[5].name == 'done'
	assert retrieved_project.swimlanes[5].order == 6
	assert retrieved_project.swimlanes[5].is_done == true
	assert retrieved_project.swimlanes[5].color == '#00CC00'

	// Verify order is preserved
	for i in 0 .. retrieved_project.swimlanes.len - 1 {
		assert retrieved_project.swimlanes[i].order <= retrieved_project.swimlanes[i + 1].order
	}
}

fn test_project_milestones() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create project with multiple milestones
	now := ourtime.now().unix()
	mut project := mydb.project.new(
		name: 'Milestone Test Project'
		description: 'Testing milestone functionality'
		swimlanes: []Swimlane{}
		milestones: [
			Milestone{
				name: 'alpha_release'
				description: 'First alpha version'
				due_date: now + 86400 * 30
				completed: true
				issues: [u32(1), 2, 3, 4, 5]
			},
			Milestone{
				name: 'beta_release'
				description: 'Beta version with all features'
				due_date: now + 86400 * 60
				completed: false
				issues: [u32(6), 7, 8, 9, 10, 11, 12]
			},
			Milestone{
				name: 'production_release'
				description: 'Final production release'
				due_date: now + 86400 * 90
				completed: false
				issues: [u32(13), 14, 15]
			},
		]
		issues: []string{}
		fs_files: []u32{}
		status: .active
		start_date: ourtime.new_from_epoch(u64(now - 86400 * 10)).str()
		end_date: ourtime.new_from_epoch(u64(now + 86400 * 100)).str()
		securitypolicy: 1
		tags: ['release', 'milestones']
		comments: []
	) or { panic('Failed to create project: ${err}') }

	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }

	retrieved_project := mydb.project.get(project.id) or { panic('Failed to get project: ${err}') }

	// Verify all milestones are preserved
	assert retrieved_project.milestones.len == 3

	// Check specific milestones
	assert retrieved_project.milestones[0].name == 'alpha_release'
	assert retrieved_project.milestones[0].completed == true
	assert retrieved_project.milestones[0].issues.len == 5
	assert retrieved_project.milestones[0].due_date == now + 86400 * 30

	assert retrieved_project.milestones[1].name == 'beta_release'
	assert retrieved_project.milestones[1].completed == false
	assert retrieved_project.milestones[1].issues.len == 7

	assert retrieved_project.milestones[2].name == 'production_release'
	assert retrieved_project.milestones[2].completed == false
	assert retrieved_project.milestones[2].issues.len == 3
	assert retrieved_project.milestones[2].due_date == now + 86400 * 90

	// Verify due dates are in chronological order
	assert retrieved_project.milestones[0].due_date <= retrieved_project.milestones[1].due_date
	assert retrieved_project.milestones[1].due_date <= retrieved_project.milestones[2].due_date
}

fn test_project_update() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create and save a project
	now := ourtime.now().unix()
	mut project := mydb.project.new(
		name: 'Original Project'
		description: 'Original description'
		swimlanes: [
			Swimlane{
				name: 'todo'
				description: 'To do items'
				order: 1
				color: '#FF0000'
				is_done: false
			},
		]
		milestones: []Milestone{}
		issues: ['original-1']
		fs_files: [u32(100)]
		status: .planning
		start_date: ourtime.new_from_epoch(u64(now)).str()
		end_date: ourtime.new_from_epoch(u64(now + 86400 * 30)).str()
		securitypolicy: 1
		tags: ['original']
		comments: []
	) or { panic('Failed to create project: ${err}') }

	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
	original_id := project.id
	original_created_at := project.created_at
	original_updated_at := project.updated_at

	// Update the project
	project.name = 'Updated Project'
	project.description = 'Updated description'
	project.swimlanes = [
		Swimlane{
			name: 'backlog'
			description: 'Product backlog'
			order: 1
			color: '#CCCCCC'
			is_done: false
		},
		Swimlane{
			name: 'done'
			description: 'Completed items'
			order: 2
			color: '#00FF00'
			is_done: true
		},
	]
	project.milestones = [
		Milestone{
			name: 'v1_release'
			description: 'Version 1.0 release'
			due_date: now + 86400 * 60
			completed: false
			issues: [u32(1), 2, 3]
		},
	]
	project.issues = ['updated-1', 'updated-2']
	project.fs_files = [u32(200), 300]
	project.status = .active
	project.end_date = now + 86400 * 60

	project = mydb.project.set(project) or { panic('Failed to update project: ${err}') }

	// Verify ID remains the same and updated_at is set
	assert project.id == original_id
	assert project.created_at == original_created_at
	assert project.updated_at >= original_updated_at

	// Retrieve and verify updates
	updated_project := mydb.project.get(project.id) or {
		panic('Failed to get updated project: ${err}')
	}
	assert updated_project.name == 'Updated Project'
	assert updated_project.description == 'Updated description'
	assert updated_project.swimlanes.len == 2
	assert updated_project.swimlanes[0].name == 'backlog'
	assert updated_project.swimlanes[1].name == 'done'
	assert updated_project.swimlanes[1].is_done == true
	assert updated_project.milestones.len == 1
	assert updated_project.milestones[0].name == 'v1_release'
	assert updated_project.issues.len == 2
	assert updated_project.issues[0] == 'updated-1'
	assert updated_project.fs_files.len == 2
	assert updated_project.status == .active
}

fn test_project_exist() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test non-existent project
	exists := mydb.project.exist(999) or { panic('Failed to check existence: ${err}') }
	assert exists == false

	// Create and save a project
	mut project := mydb.project.new(
		name: 'Existence Test'
		description: 'Testing existence'
		swimlanes: []Swimlane{}
		milestones: []Milestone{}
		issues: []string{}
		fs_files: []u32{}
		status: .planning
		start_date: ourtime.now().str()
		end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 30)).str()
		securitypolicy: 1
		tags: ['test']
		comments: []
	) or { panic('Failed to create project: ${err}') }

	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }

	// Test existing project
	exists_after_save := mydb.project.exist(project.id) or {
		panic('Failed to check existence: ${err}')
	}
	assert exists_after_save == true
}

fn test_project_delete() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create and save a project
	mut project := mydb.project.new(
		name: 'To Be Deleted'
		description: 'This project will be deleted'
		swimlanes: []Swimlane{}
		milestones: []Milestone{}
		issues: []string{}
		fs_files: []u32{}
		status: .cancelled
		start_date: ourtime.now().str()
		end_date: ourtime.new_from_epoch(u64(ourtime.now().unix() + 86400 * 30)).str()
		securitypolicy: 1
		tags: []
		comments: []
	) or { panic('Failed to create project: ${err}') }

	project = mydb.project.set(project) or { panic('Failed to save project: ${err}') }
	project_id := project.id

	// Verify it exists
	exists_before := mydb.project.exist(project_id) or {
		panic('Failed to check existence: ${err}')
	}
	assert exists_before == true

	// Delete the project
	mydb.project.delete(project_id) or { panic('Failed to delete project: ${err}') }

	// Verify it no longer exists
	exists_after := mydb.project.exist(project_id) or { panic('Failed to check existence: ${err}') }
	assert exists_after == false

	// Verify get fails
	if _ := mydb.project.get(project_id) {
		panic('Should not be able to get deleted project')
	}
}

fn test_project_list() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Clear any existing projects by creating a fresh DB
	mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }

	// Initially should be empty
	initial_list := mydb.project.list() or { panic('Failed to list projects: ${err}') }
	initial_count := initial_list.len

	// Create multiple projects
	now := ourtime.now().unix()
	mut project1 := mydb.project.new(
		name: 'Project Alpha'
		description: 'First test project'
		swimlanes: [
			Swimlane{
				name: 'todo'
				description: 'To do'
				order: 1
				color: '#FF0000'
				is_done: false
			},
		]
		milestones: []Milestone{}
		issues: ['alpha-1']
		fs_files: []u32{}
		status: .active
		start_date: ourtime.new_from_epoch(u64(now)).str()
		end_date: ourtime.new_from_epoch(u64(now + 86400 * 30)).str()
		securitypolicy: 1
		tags: ['alpha', 'test']
		comments: []
	) or { panic('Failed to create project1: ${err}') }

	mut project2 := mydb.project.new(
		name: 'Project Beta'
		description: 'Second test project'
		swimlanes: []Swimlane{}
		milestones: [
			Milestone{
				name: 'beta_milestone'
				description: 'Beta milestone'
				due_date: now + 86400 * 45
				completed: false
				issues: [u32(1), 2]
			},
		]
		issues: ['beta-1', 'beta-2']
		fs_files: [u32(100), 200]
		status: .planning
		start_date: ourtime.new_from_epoch(u64(now + 86400 * 7)).str()
		end_date: ourtime.new_from_epoch(u64(now + 86400 * 60)).str()
		securitypolicy: 2
		tags: ['beta', 'test']
		comments: []
	) or { panic('Failed to create project2: ${err}') }

	// Save both projects
	project1 = mydb.project.set(project1) or { panic('Failed to save project1: ${err}') }
	project2 = mydb.project.set(project2) or { panic('Failed to save project2: ${err}') }

	// List projects
	project_list := mydb.project.list() or { panic('Failed to list projects: ${err}') }

	// Should have 2 more projects than initially
	assert project_list.len == initial_count + 2

	// Find our projects in the list
	mut found_project1 := false
	mut found_project2 := false

	for proj in project_list {
		if proj.name == 'Project Alpha' {
			found_project1 = true
			assert proj.status == .active
			assert proj.swimlanes.len == 1
			assert proj.issues.len == 1
		}
		if proj.name == 'Project Beta' {
			found_project2 = true
			assert proj.status == .planning
			assert proj.milestones.len == 1
			assert proj.issues.len == 2
		}
	}

	assert found_project1 == true
	assert found_project2 == true
}
@@ -78,7 +78,7 @@ pub fn (self User) example(methodname string) (string, string) {
 	}
 }
 
-pub fn (self User) dump(mut e encoder.Encoder) ! {
+fn (self User) dump(mut e encoder.Encoder) ! {
 	e.add_string(self.email)
 	e.add_string(self.public_key)
 	e.add_string(self.phone)
@@ -123,6 +123,13 @@ pub mut:
 	db &db.DB @[skip; str: skip]
 }
 
+@[params]
+pub struct UserListArg {
+pub mut:
+	status UserStatus
+	limit int = 100 // Default limit is 100
+}
+
 // get new user, not from the DB
 pub fn (mut self DBUser) new(args UserArg) !User {
 	mut o := User{
@@ -166,6 +173,34 @@ pub fn (mut self DBUser) get(id u32) !User {
 	return o
 }
 
-pub fn (mut self DBUser) list() ![]User {
-	return self.db.list[User]()!.map(self.get(it)!)
+pub fn (mut self DBUser) list(args UserListArg) ![]User {
+	// Require at least one parameter to be provided
+	if args.status == .active {
+		return error('At least one filter parameter must be provided')
+	}
+
+	// Get all users from the database
+	all_users := self.db.list[User]()!.map(self.get(it)!)
+
+	// Apply filters
+	mut filtered_users := []User{}
+	for user in all_users {
+		// Filter by status if provided (status is not active)
+		if args.status != .active && user.status != args.status {
+			continue
+		}
+
+		filtered_users << user
+	}
+
+	// Limit results to 100 or the specified limit
+	limit := args.limit
+	if limit > 100 {
+		limit = 100
+	}
+	if filtered_users.len > limit {
+		return filtered_users[..limit]
+	}
+
+	return filtered_users
 }
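The hunk above replaces the parameterless DBUser.list with a variant that takes the new UserListArg filter. A minimal call-site sketch, assuming a heromodels DB handle named mydb as in the test files in this commit; the field names come from the UserListArg struct added above:

	// hypothetical usage: fetch at most 10 inactive users through the new filter
	inactive_users := mydb.user.list(status: .inactive, limit: 10) or { panic(err) }
	println(inactive_users.len)

Note that the new implementation treats status: .active as "no filter provided" and returns an error, so this sketch filters on .inactive.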
@@ -1,370 +0,0 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module heromodels

import freeflowuniverse.herolib.hero.heromodels

// Test User model CRUD operations
fn test_user_new() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test creating a new user with all fields
	mut user := mydb.user.new(
		name: 'John Doe'
		description: 'Software developer and tech enthusiast'
		email: 'john.doe@example.com'
		public_key: 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQ...'
		phone: '+1-555-123-4567'
		address: '123 Main St, Anytown, USA'
		avatar_url: 'https://example.com/avatar.jpg'
		bio: 'Passionate about technology and open source'
		timezone: 'America/New_York'
		status: .active
		securitypolicy: 1
		tags: 1
		comments: []u32{}
	) or { panic('Failed to create user: ${err}') }

	// Verify the user was created with correct values
	assert user.name == 'John Doe'
	assert user.description == 'Software developer and tech enthusiast'
	assert user.email == 'john.doe@example.com'
	assert user.public_key == 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQ...'
	assert user.phone == '+1-555-123-4567'
	assert user.address == '123 Main St, Anytown, USA'
	assert user.avatar_url == 'https://example.com/avatar.jpg'
	assert user.bio == 'Passionate about technology and open source'
	assert user.timezone == 'America/New_York'
	assert user.status == .active
	assert user.id == 0 // Should be 0 before saving
	assert user.updated_at > 0 // Should have timestamp
}

fn test_user_set_and_get() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create a user
	mut user := mydb.user.new(
		name: 'Alice Smith'
		description: 'Product manager with 5 years experience'
		email: 'alice.smith@company.com'
		public_key: 'ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI...'
		phone: '+44-20-7946-0958'
		address: 'London, UK'
		avatar_url: 'https://company.com/avatars/alice.png'
		bio: 'Leading product development teams'
		timezone: 'Europe/London'
		status: .active
		securitypolicy: 2
		tags: 2
		comments: []u32{}
	) or { panic('Failed to create user: ${err}') }

	// Save the user
	user = mydb.user.set(user) or { panic('Failed to save user: ${err}') }

	// Verify ID was assigned
	assert user.id > 0
	original_id := user.id

	// Retrieve the user
	retrieved_user := mydb.user.get(user.id) or { panic('Failed to get user: ${err}') }

	// Verify all fields match
	assert retrieved_user.id == original_id
	assert retrieved_user.name == 'Alice Smith'
	assert retrieved_user.description == 'Product manager with 5 years experience'
	assert retrieved_user.email == 'alice.smith@company.com'
	assert retrieved_user.public_key == 'ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI...'
	assert retrieved_user.phone == '+44-20-7946-0958'
	assert retrieved_user.address == 'London, UK'
	assert retrieved_user.avatar_url == 'https://company.com/avatars/alice.png'
	assert retrieved_user.bio == 'Leading product development teams'
	assert retrieved_user.timezone == 'Europe/London'
	assert retrieved_user.status == .active
	assert retrieved_user.created_at > 0
	assert retrieved_user.updated_at > 0
}

fn test_user_update() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create and save a user
	mut user := mydb.user.new(
		name: 'Bob Wilson'
		description: 'Junior developer'
		email: 'bob.wilson@startup.com'
		public_key: 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQ...'
		phone: '+1-415-555-0123'
		address: 'San Francisco, CA'
		avatar_url: 'https://startup.com/bob.jpg'
		bio: 'Learning and growing in tech'
		timezone: 'America/Los_Angeles'
		status: .pending
		securitypolicy: 1
		tags: 3
		comments: []u32{}
	) or { panic('Failed to create user: ${err}') }

	user = mydb.user.set(user) or { panic('Failed to save user: ${err}') }
	original_id := user.id
	original_created_at := user.created_at
	original_updated_at := user.updated_at

	// Update the user
	user.name = 'Bob Wilson Jr.'
	user.description = 'Senior developer'
	user.email = 'bob.wilson.jr@bigtech.com'
	user.phone = '+1-415-555-9999'
	user.address = 'Palo Alto, CA'
	user.bio = 'Experienced full-stack developer'
	user.status = .active

	user = mydb.user.set(user) or { panic('Failed to update user: ${err}') }

	// Verify ID remains the same and updated_at is set
	assert user.id == original_id
	assert user.created_at == original_created_at
	assert user.updated_at >= original_updated_at

	// Retrieve and verify updates
	updated_user := mydb.user.get(user.id) or { panic('Failed to get updated user: ${err}') }
	assert updated_user.name == 'Bob Wilson Jr.'
	assert updated_user.description == 'Senior developer'
	assert updated_user.email == 'bob.wilson.jr@bigtech.com'
	assert updated_user.phone == '+1-415-555-9999'
	assert updated_user.address == 'Palo Alto, CA'
	assert updated_user.bio == 'Experienced full-stack developer'
	assert updated_user.status == .active
}

fn test_user_status_transitions() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test all user status values
	statuses := [heromodels.UserStatus.pending, .active, .inactive, .suspended]

	for status in statuses {
		mut user := mydb.user.new(
			name: 'Test User ${status}'
			description: 'Testing status ${status}'
			email: 'test.${status}@example.com'
			public_key: ''
			phone: ''
			address: ''
			avatar_url: ''
			bio: ''
			timezone: 'UTC'
			status: status
			securitypolicy: 1
			tags: 0
			comments: []u32{}
		) or { panic('Failed to create user with status ${status}: ${err}') }

		user = mydb.user.set(user) or { panic('Failed to save user with status ${status}: ${err}') }

		retrieved_user := mydb.user.get(user.id) or {
			panic('Failed to get user with status ${status}: ${err}')
		}
		assert retrieved_user.status == status
	}
}

fn test_user_exist() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test non-existent user
	exists := mydb.user.exist(999) or { panic('Failed to check existence: ${err}') }
	assert exists == false

	// Create and save a user
	mut user := mydb.user.new(
		name: 'Existence Test'
		description: 'Testing existence'
		email: 'exist@test.com'
		public_key: ''
		phone: ''
		address: ''
		avatar_url: ''
		bio: ''
		timezone: 'UTC'
		status: .active
		securitypolicy: 1
		tags: 4
		comments: []u32{}
	) or { panic('Failed to create user: ${err}') }

	user = mydb.user.set(user) or { panic('Failed to save user: ${err}') }

	// Test existing user
	exists_after_save := mydb.user.exist(user.id) or { panic('Failed to check existence: ${err}') }
	assert exists_after_save == true
}

fn test_user_delete() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Create and save a user
	mut user := mydb.user.new(
		name: 'To Be Deleted'
		description: 'This user will be deleted'
		email: 'delete@me.com'
		public_key: ''
		phone: ''
		address: ''
		avatar_url: ''
		bio: ''
		timezone: 'UTC'
		status: .inactive
		securitypolicy: 1
		tags: 0
		comments: []u32{}
	) or { panic('Failed to create user: ${err}') }

	user = mydb.user.set(user) or { panic('Failed to save user: ${err}') }
	user_id := user.id

	// Verify it exists
	exists_before := mydb.user.exist(user_id) or { panic('Failed to check existence: ${err}') }
	assert exists_before == true

	// Delete the user
	mydb.user.delete(user_id) or { panic('Failed to delete user: ${err}') }

	// Verify it no longer exists
	exists_after := mydb.user.exist(user_id) or { panic('Failed to check existence: ${err}') }
	assert exists_after == false

	// Verify get fails
	if _ := mydb.user.get(user_id) {
		panic('Should not be able to get deleted user')
	}
}

fn test_user_list() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Clear any existing users by creating a fresh DB
	mydb = heromodels.new() or { panic('Failed to create fresh DB: ${err}') }

	// Initially should be empty
	initial_list := mydb.user.list() or { panic('Failed to list users: ${err}') }
	initial_count := initial_list.len

	// Create multiple users
	mut user1 := mydb.user.new(
		name: 'User One'
		description: 'First test user'
		email: 'user1@test.com'
		public_key: 'key1'
		phone: '+1-111-111-1111'
		address: 'Address 1'
		avatar_url: 'avatar1.jpg'
		bio: 'Bio 1'
		timezone: 'UTC'
		status: .active
		securitypolicy: 1
		tags: 5
		comments: []u32{}
	) or { panic('Failed to create user1: ${err}') }

	mut user2 := mydb.user.new(
		name: 'User Two'
		description: 'Second test user'
		email: 'user2@test.com'
		public_key: 'key2'
		phone: '+1-222-222-2222'
		address: 'Address 2'
		avatar_url: 'avatar2.jpg'
		bio: 'Bio 2'
		timezone: 'America/New_York'
		status: .pending
		securitypolicy: 2
		tags: 6
		comments: []u32{}
	) or { panic('Failed to create user2: ${err}') }

	// Save both users
	user1 = mydb.user.set(user1) or { panic('Failed to save user1: ${err}') }
	user2 = mydb.user.set(user2) or { panic('Failed to save user2: ${err}') }

	// List users
	user_list := mydb.user.list() or { panic('Failed to list users: ${err}') }

	// Should have 2 more users than initially
	assert user_list.len == initial_count + 2

	// Find our users in the list
	mut found_user1 := false
	mut found_user2 := false

	for usr in user_list {
		if usr.name == 'User One' {
			found_user1 = true
			assert usr.email == 'user1@test.com'
			assert usr.status == .active
		}
		if usr.name == 'User Two' {
			found_user2 = true
			assert usr.email == 'user2@test.com'
			assert usr.status == .pending
		}
	}

	assert found_user1 == true
	assert found_user2 == true
}

fn test_user_edge_cases() {
	mut mydb := heromodels.new() or { panic('Failed to create DB: ${err}') }

	// Test user with empty fields
	mut empty_user := mydb.user.new(
		name: ''
		description: ''
		email: ''
		public_key: ''
		phone: ''
		address: ''
		avatar_url: ''
		bio: ''
		timezone: ''
		status: .inactive
		securitypolicy: 0
		tags: 0
		comments: []u32{}
	) or { panic('Failed to create empty user: ${err}') }

	empty_user = mydb.user.set(empty_user) or { panic('Failed to save empty user: ${err}') }

	retrieved_empty := mydb.user.get(empty_user.id) or { panic('Failed to get empty user: ${err}') }
	assert retrieved_empty.name == ''
	assert retrieved_empty.email == ''
	assert retrieved_empty.phone == ''
	assert retrieved_empty.status == .inactive

	// Test user with very long fields
	long_text := 'Very long text. '.repeat(100)
	mut long_user := mydb.user.new(
		name: long_text
		description: long_text
		email: 'very.long.email.address.that.might.be.too.long@example.com'
		public_key: long_text
		phone: '+1-555-' + '1234567890'.repeat(10)
		address: long_text
		avatar_url: 'https://example.com/' + 'path/'.repeat(50) + 'avatar.jpg'
		bio: long_text
		timezone: 'America/New_York'
		status: .active
		securitypolicy: 999
		tags: 7
		comments: []u32{}
	) or { panic('Failed to create long user: ${err}') }

	long_user = mydb.user.set(long_user) or { panic('Failed to save long user: ${err}') }

	retrieved_long := mydb.user.get(long_user.id) or { panic('Failed to get long user: ${err}') }
	assert retrieved_long.name == long_text
	assert retrieved_long.description == long_text
	assert retrieved_long.status == .active
}