...
181	lib/threefold/models_ledger/README_TESTS.md	Normal file

@@ -0,0 +1,181 @@
# Models Ledger Tests

This directory contains comprehensive tests for all models in the `models_ledger` module. The tests focus primarily on encoding/decoding functionality to ensure data integrity through the database layer.

## Test Structure

Each model has its own test file following the naming convention `{model}_test.v`:

- `account_test.v` - Tests for Account model with complex nested structures
- `asset_test.v` - Tests for Asset model with metadata maps
- `user_test.v` - Tests for User model with SecretBox arrays
- `transaction_test.v` - Tests for Transaction model with signature arrays
- `dnszone_test.v` - Tests for DNSZone model with DNS records and SOA records
- `group_test.v` - Tests for Group model with configuration structs
- `member_test.v` - Tests for Member model with enums
- `notary_test.v` - Tests for Notary model
- `signature_test.v` - Tests for Signature model
- `userkvs_test.v` - Tests for UserKVS model
- `userkvsitem_test.v` - Tests for UserKVSItem model
- `models_test.v` - Integration tests for all models together
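
All of these files share the same skeleton, sketched below. It mirrors the header of `account_test.v`; `test_example_new` is a placeholder name, not a real test in the module.

```v
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
// encoder is used by the encoding/decoding tests
import freeflowuniverse.herolib.data.encoder

// shared helper: every test works against a fresh in-memory database
fn setup_test_db() !db.DB {
	return db.new(path: ':memory:')!
}

fn test_example_new() {
	mut mydb := setup_test_db()!
	// model-specific setup and assertions go here
}
```
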
## Test Categories

### 1. Model Creation (`test_{model}_new`)

Tests the creation of new model instances with all field types:

- Required fields
- Optional fields
- Default values
- Complex nested structures
- Arrays and maps

### 2. Encoding/Decoding (`test_{model}_encoding_decoding`)

**Primary focus** - Tests serialization and deserialization:

- Encoding to binary format using `dump(mut encoder.Encoder)`
- Decoding from binary format using the DB helper's `load(mut object, mut encoder.Decoder)`
- Field-by-field verification after roundtrip
- Complex data structures (nested structs, arrays, maps)
- Edge cases (empty data, large data)

### 3. Database Operations (`test_{model}_set_and_get`)

Tests CRUD operations through the database layer:

- Create and save (`set`)
- Retrieve (`get`)
- ID assignment
- Data persistence through the database roundtrip

### 4. Update Operations (`test_{model}_update`)

Tests updating existing records:

- Modify fields
- Save changes
- Verify updates persist
- ID and `created_at` preservation

### 5. Existence and Deletion (`test_{model}_exist_and_delete`)

Tests existence checking and deletion:

- Check non-existent records
- Create and verify existence
- Delete records
- Verify deletion

### 6. List Operations (`test_{model}_list`)

Tests listing all records:

- Initial empty state
- Create multiple records
- List and count verification
- Find specific records in lists

### 7. Edge Cases (`test_{model}_edge_cases`)

Tests boundary conditions (see the sketch after this list):

- Empty/minimal data
- Very large data
- Special characters
- Unicode handling
- Maximum array sizes
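
A minimal sketch of such a check, assuming the `Asset` fields used in `asset_test.v`; the test name and the concrete strings are illustrative only:

```v
fn test_asset_unicode_metadata() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// special characters and unicode must survive the set/get roundtrip unchanged
	mut asset := asset_db.new(
		name: 'Tëst Tökén'
		description: 'quotes "…" and unicode 日本語'
		address: ''
		asset_type: 'token'
		issuer: 1
		supply: 1.0
		decimals: 0
		is_frozen: false
		metadata: {'unicode': '日本語', 'special': '<>&"'}
		administrators: []u32{}
		min_signatures: 0
	)!
	asset = asset_db.set(asset)!
	retrieved := asset_db.get(asset.id)!
	assert retrieved.name == 'Tëst Tökén'
	assert retrieved.metadata['unicode'] == '日本語'
}
```
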
## Key Features Tested

### Encoding/Decoding Integrity

All of the following must survive an encode/decode roundtrip unchanged (see the sketch below):

- **Primitive types**: strings, integers, floats, booleans
- **Arrays**: `[]u32`, `[]string`, `[]SecretBox`, etc.
- **Maps**: `map[string]string` for metadata
- **Enums**: All enum types with proper conversion
- **Nested structs**: Complex hierarchical data
- **Binary data**: Encrypted data in SecretBox
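
A minimal roundtrip sketch using the `encoder` helpers that appear in the model code (`add_string`, `add_u32`, `add_bool`, `add_map_string`, `add_list_u32` and their `get_*` counterparts); the decode calls must mirror the encode order:

```v
mut e := encoder.encoder_new()
e.add_string('hello')
e.add_u32(42)
e.add_bool(true)
meta := {
	'currency': 'USD'
}
e.add_map_string(meta)
e.add_list_u32([u32(1), 2, 3])

mut d := encoder.decoder_new(e.data)
assert d.get_string()! == 'hello'
assert d.get_u32()! == 42
assert d.get_bool()! == true
m := d.get_map_string()!
assert m['currency'] == 'USD'
assert d.get_list_u32()! == [u32(1), 2, 3]
```
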
### Complex Data Structures

- **Account**: Nested AccountPolicy and AccountAsset arrays
- **DNSZone**: DNS records with different types and SOA records
- **User**: Encrypted data arrays (userprofile, kyc)
- **Transaction**: Signature arrays with timestamps
- **Group**: Configuration structs with multiple fields

### Performance Considerations

The tests also exercise larger payloads (see the sketch below):

- Large array handling (1000+ elements)
- Large binary data (10KB+ encrypted data)
- Complex nested structures
- Memory efficiency during encoding/decoding
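
A sketch of a large-data roundtrip using the same encoder helpers; the sizes match the bullets above and the check only verifies lengths:

```v
mut big_list := []u32{}
for i in 0 .. 1000 {
	big_list << u32(i)
}
big_blob := 'x'.repeat(10 * 1024) // ~10 KB payload

mut e := encoder.encoder_new()
e.add_list_u32(big_list)
e.add_string(big_blob)

mut d := encoder.decoder_new(e.data)
assert d.get_list_u32()!.len == 1000
assert d.get_string()!.len == big_blob.len
```
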
## Running Tests

Run all model tests:

```bash
vtest ~/code/github/incubaid/herolib/lib/threefold/models_ledger/
```

Run specific model tests:

```bash
vtest ~/code/github/incubaid/herolib/lib/threefold/models_ledger/account_test.v
vtest ~/code/github/incubaid/herolib/lib/threefold/models_ledger/user_test.v
```

Run integration tests:

```bash
vtest ~/code/github/incubaid/herolib/lib/threefold/models_ledger/models_test.v
```

## Common Test Patterns

### Setup

Each test file uses a common setup function:

```v
fn setup_test_db() !db.DB {
	return db.new(path: ':memory:')!
}
```

### Encoding/Decoding Pattern

```v
// Test encoding
mut encoder_obj := encoder.encoder_new()
original_object.dump(mut encoder_obj)!
encoded_data := encoder_obj.data

// Test decoding
mut decoder_obj := encoder.decoder_new(encoded_data)
mut decoded_object := ObjectType{}
object_db.load(mut decoded_object, mut decoder_obj)!

// Verify all fields match
assert decoded_object.field == original_object.field
```

### CRUD Pattern

```v
// Create
mut object := object_db.new(args)!

// Save
object = object_db.set(object)!
assert object.id > 0

// Get
retrieved := object_db.get(object.id)!
assert retrieved.field == object.field

// Update
object.field = new_value
object = object_db.set(object)!

// Delete
object_db.delete(object.id)!
assert object_db.exist(object.id)! == false
```
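
### List Pattern

A sketch of the checks performed by the `test_{model}_list` tests, using the same placeholder names as the CRUD pattern above:

```v
// Count existing records
initial_count := object_db.list()!.len

// Create and save a few records, then list again
object1 = object_db.set(object1)!
object2 = object_db.set(object2)!
object_list := object_db.list()!
assert object_list.len == initial_count + 2

// Find a specific record in the list
mut found := false
for obj in object_list {
	if obj.name == 'Object 1' {
		found = true
	}
}
assert found
```
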
## Error Conditions

Tests verify proper error handling for (a short sketch follows the list):

- Non-existent record retrieval
- Invalid data encoding/decoding
- Database constraint violations
- Memory allocation failures
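
For example, the non-existent-record case is checked with the same pattern used across the test files (shown here with the placeholder `object_db`; `999` is an arbitrary unused id):

```v
// exist() must report false for an id that was never stored
assert object_db.exist(999)! == false

// get() must return an error, not a value
if _ := object_db.get(999) {
	assert false
}
```
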
## Performance Metrics

Integration tests include performance verification:

- Large data encoding/decoding speed
- Memory usage with complex structures
- Database roundtrip efficiency
- Array and map handling performance

The tests ensure that all models can handle real-world usage scenarios with proper data integrity and performance characteristics.

328	lib/threefold/models_ledger/account_test.v	Normal file

@@ -0,0 +1,328 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn setup_test_db() !db.DB {
	return db.new(path: ':memory:')!
}

// Test Account model CRUD operations with focus on encoding/decoding
fn test_account_new() {
	mut mydb := setup_test_db()!
	mut account_db := DBAccount{db: &mydb}

	// Create test account with complex nested structures
	policy := AccountPolicyArg{
		policy_id: 1
		admins: [u32(1), 2, 3]
		min_signatures: 2
		limits: [
			AccountLimit{
				amount: 1000.0
				asset_id: 1
				period: .daily
			},
			AccountLimit{
				amount: 5000.0
				asset_id: 2
				period: .monthly
			}
		]
		whitelist_out: [u32(10), 20]
		whitelist_in: [u32(30), 40]
		lock_till: 1234567890
		admin_lock_type: .locked_till
		admin_lock_till: 1234567900
		admin_unlock: [u32(5), 6]
		admin_unlock_min_signature: 1
		clawback_accounts: [u32(7), 8]
		clawback_min_signatures: 2
		clawback_from: 1234567800
		clawback_to: 1234567950
	}

	asset := AccountAsset{
		assetid: 1
		balance: 1500.0
		metadata: {'currency': 'USD', 'type': 'stable'}
	}

	mut account := account_db.new(
		name: 'Test Account'
		description: 'A test account for unit testing'
		owner_id: 123
		location_id: 456
		accountpolicies: [policy]
		assets: [asset]
		assetid: 1
		last_activity: 1234567890
		administrators: [u32(1), 2, 3]
	)!

	// Verify the account was created with correct values
	assert account.name == 'Test Account'
	assert account.description == 'A test account for unit testing'
	assert account.owner_id == 123
	assert account.location_id == 456
	assert account.accountpolicies.len == 1
	assert account.accountpolicies[0].policy_id == 1
	assert account.accountpolicies[0].admins.len == 3
	assert account.accountpolicies[0].limits.len == 2
	assert account.assets.len == 1
	assert account.assets[0].assetid == 1
	assert account.assets[0].balance == 1500.0
	assert account.administrators.len == 3
}

fn test_account_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut account_db := DBAccount{db: &mydb}

	// Create a complex account with all field types
	policy := AccountPolicyArg{
		policy_id: 1
		admins: [u32(1), 2, 3]
		min_signatures: 2
		limits: [
			AccountLimit{
				amount: 1000.0
				asset_id: 1
				period: .daily
			}
		]
		whitelist_out: [u32(10)]
		whitelist_in: [u32(30)]
		lock_till: 1234567890
		admin_lock_type: .locked
		admin_lock_till: 1234567900
		admin_unlock: [u32(5)]
		admin_unlock_min_signature: 1
		clawback_accounts: [u32(7)]
		clawback_min_signatures: 1
		clawback_from: 1234567800
		clawback_to: 1234567950
	}

	asset := AccountAsset{
		assetid: 1
		balance: 1500.0
		metadata: {'currency': 'USD', 'type': 'stable', 'issuer': 'test'}
	}

	mut original_account := account_db.new(
		name: 'Encoding Test Account'
		description: 'Testing encoding and decoding'
		owner_id: 999
		location_id: 888
		accountpolicies: [policy]
		assets: [asset]
		assetid: 1
		last_activity: 1234567890
		administrators: [u32(1), 2, 3, 4, 5]
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_account.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_account := Account{}
	account_db.load(mut decoded_account, mut decoder_obj)!

	// Verify all fields match after encoding/decoding
	assert decoded_account.owner_id == original_account.owner_id
	assert decoded_account.location_id == original_account.location_id
	assert decoded_account.assetid == original_account.assetid
	assert decoded_account.last_activity == original_account.last_activity
	assert decoded_account.administrators.len == original_account.administrators.len
	assert decoded_account.administrators[0] == original_account.administrators[0]
	assert decoded_account.administrators[4] == original_account.administrators[4]

	// Verify account policies
	assert decoded_account.accountpolicies.len == original_account.accountpolicies.len
	decoded_policy := decoded_account.accountpolicies[0]
	original_policy := original_account.accountpolicies[0]
	assert decoded_policy.policy_id == original_policy.policy_id
	assert decoded_policy.admins.len == original_policy.admins.len
	assert decoded_policy.min_signatures == original_policy.min_signatures
	assert decoded_policy.limits.len == original_policy.limits.len
	assert decoded_policy.limits[0].amount == original_policy.limits[0].amount
	assert decoded_policy.limits[0].asset_id == original_policy.limits[0].asset_id
	assert decoded_policy.limits[0].period == original_policy.limits[0].period

	// Verify assets
	assert decoded_account.assets.len == original_account.assets.len
	decoded_asset := decoded_account.assets[0]
	original_asset := original_account.assets[0]
	assert decoded_asset.assetid == original_asset.assetid
	assert decoded_asset.balance == original_asset.balance
	assert decoded_asset.metadata.len == original_asset.metadata.len
	assert decoded_asset.metadata['currency'] == original_asset.metadata['currency']
	assert decoded_asset.metadata['type'] == original_asset.metadata['type']
}

fn test_account_set_and_get() {
	mut mydb := setup_test_db()!
	mut account_db := DBAccount{db: &mydb}

	// Create account
	policy := AccountPolicyArg{
		policy_id: 1
		admins: [u32(1)]
		min_signatures: 1
		limits: []AccountLimit{}
		whitelist_out: []u32{}
		whitelist_in: []u32{}
		lock_till: 0
		admin_lock_type: .free
		admin_lock_till: 0
		admin_unlock: []u32{}
		admin_unlock_min_signature: 0
		clawback_accounts: []u32{}
		clawback_min_signatures: 0
		clawback_from: 0
		clawback_to: 0
	}

	mut account := account_db.new(
		name: 'DB Test Account'
		description: 'Testing database operations'
		owner_id: 123
		location_id: 456
		accountpolicies: [policy]
		assets: []AccountAsset{}
		assetid: 1
		last_activity: 1234567890
		administrators: [u32(1), 2]
	)!

	// Save the account
	account = account_db.set(account)!

	// Verify ID was assigned
	assert account.id > 0
	original_id := account.id

	// Retrieve the account
	retrieved_account := account_db.get(account.id)!

	// Verify all fields match through the database roundtrip
	assert retrieved_account.id == original_id
	assert retrieved_account.name == 'DB Test Account'
	assert retrieved_account.description == 'Testing database operations'
	assert retrieved_account.owner_id == 123
	assert retrieved_account.location_id == 456
	assert retrieved_account.assetid == 1
	assert retrieved_account.last_activity == 1234567890
	assert retrieved_account.administrators.len == 2
	assert retrieved_account.administrators[0] == 1
	assert retrieved_account.administrators[1] == 2
	assert retrieved_account.accountpolicies.len == 1
	assert retrieved_account.accountpolicies[0].policy_id == 1
}

fn test_account_delete_and_exist() {
	mut mydb := setup_test_db()!
	mut account_db := DBAccount{db: &mydb}

	// Create and save account
	mut account := account_db.new(
		name: 'To Be Deleted'
		description: 'This account will be deleted'
		owner_id: 999
		location_id: 888
		accountpolicies: []AccountPolicyArg{}
		assets: []AccountAsset{}
		assetid: 1
		last_activity: 1234567890
		administrators: []u32{}
	)!

	account = account_db.set(account)!
	account_id := account.id

	// Verify it exists
	exists_before := account_db.exist(account_id)!
	assert exists_before == true

	// Delete the account
	account_db.delete(account_id)!

	// Verify it no longer exists
	exists_after := account_db.exist(account_id)!
	assert exists_after == false

	// Verify get fails
	if _ := account_db.get(account_id) {
		panic('Should not be able to get deleted account')
	}
}

fn test_account_list() {
	mut mydb := setup_test_db()!
	mut account_db := DBAccount{db: &mydb}

	// Initially should be empty
	initial_list := account_db.list()!
	initial_count := initial_list.len

	// Create multiple accounts
	mut account1 := account_db.new(
		name: 'Account 1'
		description: 'First account'
		owner_id: 1
		location_id: 1
		accountpolicies: []AccountPolicyArg{}
		assets: []AccountAsset{}
		assetid: 1
		last_activity: 1234567890
		administrators: []u32{}
	)!

	mut account2 := account_db.new(
		name: 'Account 2'
		description: 'Second account'
		owner_id: 2
		location_id: 2
		accountpolicies: []AccountPolicyArg{}
		assets: []AccountAsset{}
		assetid: 2
		last_activity: 1234567891
		administrators: [u32(1)]
	)!

	// Save both accounts
	account1 = account_db.set(account1)!
	account2 = account_db.set(account2)!

	// List accounts
	account_list := account_db.list()!

	// Should have 2 more accounts than initially
	assert account_list.len == initial_count + 2

	// Find our accounts in the list
	mut found_account1 := false
	mut found_account2 := false

	for acc in account_list {
		if acc.name == 'Account 1' {
			found_account1 = true
			assert acc.owner_id == 1
			assert acc.location_id == 1
		}
		if acc.name == 'Account 2' {
			found_account2 = true
			assert acc.owner_id == 2
			assert acc.administrators.len == 1
		}
	}

	assert found_account1 == true
	assert found_account2 == true
}

@@ -64,13 +64,7 @@ pub fn (self Asset) dump(mut e encoder.Encoder) ! {
 	e.add_u8(self.decimals)
 	e.add_bool(self.is_frozen)
 
-	// metadata map
-	e.add_int(self.metadata.len)
-	for key, value in self.metadata {
-		e.add_string(key)
-		e.add_string(value)
-	}
-
+	e.add_map_string(self.metadata)
 	e.add_list_u32(self.administrators)
 	e.add_u32(self.min_signatures)
 }

@@ -83,15 +77,7 @@ fn (mut self DBAsset) load(mut o Asset, mut e encoder.Decoder) ! {
 	o.decimals = e.get_u8()!
 	o.is_frozen = e.get_bool()!
 
-	// metadata map
-	metadata_len := e.get_int()!
-	o.metadata = map[string]string{}
-	for _ in 0 .. metadata_len {
-		key := e.get_string()!
-		value := e.get_string()!
-		o.metadata[key] = value
-	}
-
+	o.metadata = e.get_map_string()!
 	o.administrators = e.get_list_u32()!
 	o.min_signatures = e.get_u32()!
 }

371	lib/threefold/models_ledger/asset_test.v	Normal file

@@ -0,0 +1,371 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_asset_new() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Test creating a new asset with all fields
	mut asset := asset_db.new(
		name: 'Test Token'
		description: 'A test token for unit testing'
		address: 'GBTC...XYZ123'
		asset_type: 'token'
		issuer: 1
		supply: 1000000.0
		decimals: 8
		is_frozen: false
		metadata: {'symbol': 'TEST', 'website': 'https://test.com'}
		administrators: [u32(1), 2, 3]
		min_signatures: 2
	)!

	// Verify the asset was created with correct values
	assert asset.name == 'Test Token'
	assert asset.description == 'A test token for unit testing'
	assert asset.address == 'GBTC...XYZ123'
	assert asset.asset_type == 'token'
	assert asset.issuer == 1
	assert asset.supply == 1000000.0
	assert asset.decimals == 8
	assert asset.is_frozen == false
	assert asset.metadata.len == 2
	assert asset.metadata['symbol'] == 'TEST'
	assert asset.metadata['website'] == 'https://test.com'
	assert asset.administrators.len == 3
	assert asset.min_signatures == 2
	assert asset.id == 0 // Should be 0 before saving
	assert asset.updated_at > 0 // Should have timestamp
}

fn test_asset_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Create a complex asset with all field types
	mut original_asset := asset_db.new(
		name: 'Encoding Test Asset'
		description: 'Testing encoding and decoding functionality'
		address: 'GABCD...XYZ789'
		asset_type: 'nft'
		issuer: 999
		supply: 5000000.0
		decimals: 6
		is_frozen: true
		metadata: {
			'symbol': 'ETA'
			'website': 'https://example.com'
			'description': 'Extended test asset'
			'category': 'utility'
			'version': '1.0'
		}
		administrators: [u32(10), 20, 30, 40]
		min_signatures: 3
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_asset.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_asset := Asset{}
	asset_db.load(mut decoded_asset, mut decoder_obj)!

	// Verify all fields match after encoding/decoding
	assert decoded_asset.address == original_asset.address
	assert decoded_asset.asset_type == original_asset.asset_type
	assert decoded_asset.issuer == original_asset.issuer
	assert decoded_asset.supply == original_asset.supply
	assert decoded_asset.decimals == original_asset.decimals
	assert decoded_asset.is_frozen == original_asset.is_frozen
	assert decoded_asset.min_signatures == original_asset.min_signatures

	// Verify metadata map
	assert decoded_asset.metadata.len == original_asset.metadata.len
	for key, value in original_asset.metadata {
		assert decoded_asset.metadata[key] == value
	}

	// Verify administrators list
	assert decoded_asset.administrators.len == original_asset.administrators.len
	for i, admin in original_asset.administrators {
		assert decoded_asset.administrators[i] == admin
	}
}

fn test_asset_set_and_get() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Create asset
	mut asset := asset_db.new(
		name: 'DB Test Asset'
		description: 'Testing database operations'
		address: 'GTEST...DB123'
		asset_type: 'token'
		issuer: 42
		supply: 100000.0
		decimals: 2
		is_frozen: false
		metadata: {'currency': 'EUR', 'region': 'EU'}
		administrators: [u32(5), 10]
		min_signatures: 1
	)!

	// Save the asset
	asset = asset_db.set(asset)!

	// Verify ID was assigned
	assert asset.id > 0
	original_id := asset.id

	// Retrieve the asset
	retrieved_asset := asset_db.get(asset.id)!

	// Verify all fields match through the database roundtrip
	assert retrieved_asset.id == original_id
	assert retrieved_asset.name == 'DB Test Asset'
	assert retrieved_asset.description == 'Testing database operations'
	assert retrieved_asset.address == 'GTEST...DB123'
	assert retrieved_asset.asset_type == 'token'
	assert retrieved_asset.issuer == 42
	assert retrieved_asset.supply == 100000.0
	assert retrieved_asset.decimals == 2
	assert retrieved_asset.is_frozen == false
	assert retrieved_asset.metadata.len == 2
	assert retrieved_asset.metadata['currency'] == 'EUR'
	assert retrieved_asset.metadata['region'] == 'EU'
	assert retrieved_asset.administrators.len == 2
	assert retrieved_asset.administrators[0] == 5
	assert retrieved_asset.administrators[1] == 10
	assert retrieved_asset.min_signatures == 1
}

fn test_asset_update() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Create and save an asset
	mut asset := asset_db.new(
		name: 'Original Asset'
		description: 'Original description'
		address: 'GORIG...123'
		asset_type: 'token'
		issuer: 1
		supply: 1000.0
		decimals: 8
		is_frozen: false
		metadata: {'version': '1.0'}
		administrators: [u32(1)]
		min_signatures: 1
	)!

	asset = asset_db.set(asset)!
	original_id := asset.id
	original_created_at := asset.created_at

	// Update the asset
	asset.name = 'Updated Asset'
	asset.description = 'Updated description'
	asset.supply = 2000.0
	asset.is_frozen = true
	asset.metadata = {'version': '2.0', 'status': 'updated'}
	asset.administrators = [u32(1), 2, 3]
	asset.min_signatures = 2

	asset = asset_db.set(asset)!

	// Verify ID remains the same
	assert asset.id == original_id
	assert asset.created_at == original_created_at

	// Retrieve and verify updates
	updated_asset := asset_db.get(asset.id)!
	assert updated_asset.name == 'Updated Asset'
	assert updated_asset.description == 'Updated description'
	assert updated_asset.supply == 2000.0
	assert updated_asset.is_frozen == true
	assert updated_asset.metadata.len == 2
	assert updated_asset.metadata['version'] == '2.0'
	assert updated_asset.metadata['status'] == 'updated'
	assert updated_asset.administrators.len == 3
	assert updated_asset.min_signatures == 2
}

fn test_asset_exist_and_delete() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Test non-existent asset
	exists := asset_db.exist(999)!
	assert exists == false

	// Create and save an asset
	mut asset := asset_db.new(
		name: 'To Be Deleted'
		description: 'This asset will be deleted'
		address: 'GDEL...123'
		asset_type: 'token'
		issuer: 1
		supply: 1.0
		decimals: 0
		is_frozen: false
		metadata: map[string]string{}
		administrators: []u32{}
		min_signatures: 0
	)!

	asset = asset_db.set(asset)!
	asset_id := asset.id

	// Test existing asset
	exists_after_save := asset_db.exist(asset_id)!
	assert exists_after_save == true

	// Delete the asset
	asset_db.delete(asset_id)!

	// Verify it no longer exists
	exists_after_delete := asset_db.exist(asset_id)!
	assert exists_after_delete == false

	// Verify get fails
	if _ := asset_db.get(asset_id) {
		panic('Should not be able to get deleted asset')
	}
}

fn test_asset_list() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Initially should be empty
	initial_list := asset_db.list()!
	initial_count := initial_list.len

	// Create multiple assets
	mut asset1 := asset_db.new(
		name: 'Asset 1'
		description: 'First asset'
		address: 'GFIRST...123'
		asset_type: 'token'
		issuer: 1
		supply: 1000.0
		decimals: 8
		is_frozen: false
		metadata: {'type': 'utility'}
		administrators: [u32(1)]
		min_signatures: 1
	)!

	mut asset2 := asset_db.new(
		name: 'Asset 2'
		description: 'Second asset'
		address: 'GSECOND...456'
		asset_type: 'nft'
		issuer: 2
		supply: 100.0
		decimals: 0
		is_frozen: true
		metadata: {'type': 'collectible', 'rarity': 'rare'}
		administrators: [u32(1), 2]
		min_signatures: 2
	)!

	// Save both assets
	asset1 = asset_db.set(asset1)!
	asset2 = asset_db.set(asset2)!

	// List assets
	asset_list := asset_db.list()!

	// Should have 2 more assets than initially
	assert asset_list.len == initial_count + 2

	// Find our assets in the list
	mut found_asset1 := false
	mut found_asset2 := false

	for ass in asset_list {
		if ass.name == 'Asset 1' {
			found_asset1 = true
			assert ass.asset_type == 'token'
			assert ass.is_frozen == false
			assert ass.metadata['type'] == 'utility'
		}
		if ass.name == 'Asset 2' {
			found_asset2 = true
			assert ass.asset_type == 'nft'
			assert ass.is_frozen == true
			assert ass.metadata['rarity'] == 'rare'
		}
	}

	assert found_asset1 == true
	assert found_asset2 == true
}

fn test_asset_edge_cases() {
	mut mydb := setup_test_db()!
	mut asset_db := DBAsset{db: &mydb}

	// Test empty/minimal asset
	mut minimal_asset := asset_db.new(
		name: ''
		description: ''
		address: ''
		asset_type: ''
		issuer: 0
		supply: 0.0
		decimals: 0
		is_frozen: false
		metadata: map[string]string{}
		administrators: []u32{}
		min_signatures: 0
	)!

	minimal_asset = asset_db.set(minimal_asset)!
	retrieved_minimal := asset_db.get(minimal_asset.id)!

	assert retrieved_minimal.name == ''
	assert retrieved_minimal.description == ''
	assert retrieved_minimal.address == ''
	assert retrieved_minimal.asset_type == ''
	assert retrieved_minimal.issuer == 0
	assert retrieved_minimal.supply == 0.0
	assert retrieved_minimal.metadata.len == 0
	assert retrieved_minimal.administrators.len == 0

	// Test asset with large metadata map
	mut large_metadata := map[string]string{}
	for i in 0 .. 100 {
		large_metadata['key_${i}'] = 'value_${i}'
	}

	mut large_asset := asset_db.new(
		name: 'Large Metadata Asset'
		description: 'Asset with large metadata'
		address: 'GLARGE...123'
		asset_type: 'token'
		issuer: 1
		supply: 1000.0
		decimals: 8
		is_frozen: false
		metadata: large_metadata
		administrators: []u32{}
		min_signatures: 0
	)!

	large_asset = asset_db.set(large_asset)!
	retrieved_large := asset_db.get(large_asset.id)!

	assert retrieved_large.metadata.len == 100
	assert retrieved_large.metadata['key_0'] == 'value_0'
	assert retrieved_large.metadata['key_99'] == 'value_99'
}

488	lib/threefold/models_ledger/dnszone_test.v	Normal file

@@ -0,0 +1,488 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_dnszone_new() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Create test DNS zone with records and SOA
	dns_record1 := DNSRecord{
		subdomain: 'www'
		record_type: .a
		value: '192.168.1.1'
		priority: 0
		ttl: 3600
		is_active: true
		cat: .ipv4
		is_wildcard: false
	}

	dns_record2 := DNSRecord{
		subdomain: 'mail'
		record_type: .mx
		value: 'mail.example.com'
		priority: 10
		ttl: 3600
		is_active: true
		cat: .ipv4
		is_wildcard: false
	}

	soa_record := SOARecord{
		zone_id: 1
		primary_ns: 'ns1.example.com'
		admin_email: 'admin@example.com'
		serial: 2023120101
		refresh: 3600
		retry: 1800
		expire: 604800
		minimum_ttl: 86400
		is_active: true
	}

	mut dnszone := dns_db.new(
		name: 'Test DNS Zone'
		description: 'A test DNS zone for unit testing'
		domain: 'example.com'
		dnsrecords: [dns_record1, dns_record2]
		administrators: [u32(1), 2, 3]
		status: .active
		min_signatures: 2
		metadata: {'zone_type': 'primary', 'provider': 'test'}
		soarecord: [soa_record]
	)!

	// Verify the DNS zone was created with correct values
	assert dnszone.name == 'Test DNS Zone'
	assert dnszone.description == 'A test DNS zone for unit testing'
	assert dnszone.domain == 'example.com'
	assert dnszone.dnsrecords.len == 2
	assert dnszone.dnsrecords[0].subdomain == 'www'
	assert dnszone.dnsrecords[1].record_type == .mx
	assert dnszone.administrators.len == 3
	assert dnszone.status == .active
	assert dnszone.min_signatures == 2
	assert dnszone.metadata.len == 2
	assert dnszone.soarecord.len == 1
	assert dnszone.soarecord[0].primary_ns == 'ns1.example.com'
	assert dnszone.id == 0 // Should be 0 before saving
	assert dnszone.updated_at > 0 // Should have timestamp
}

fn test_dnszone_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Create a complex DNS zone with multiple record types
	records := [
		DNSRecord{
			subdomain: '@'
			record_type: .a
			value: '203.0.113.1'
			priority: 0
			ttl: 300
			is_active: true
			cat: .ipv4
			is_wildcard: false
		},
		DNSRecord{
			subdomain: 'www'
			record_type: .cname
			value: 'example.com'
			priority: 0
			ttl: 3600
			is_active: true
			cat: .ipv4
			is_wildcard: false
		},
		DNSRecord{
			subdomain: '*'
			record_type: .a
			value: '203.0.113.2'
			priority: 0
			ttl: 3600
			is_active: false
			cat: .ipv4
			is_wildcard: true
		},
		DNSRecord{
			subdomain: 'ipv6'
			record_type: .aaaa
			value: '2001:db8::1'
			priority: 0
			ttl: 3600
			is_active: true
			cat: .ipv6
			is_wildcard: false
		}
	]

	soa_records := [
		SOARecord{
			zone_id: 1
			primary_ns: 'ns1.test.com'
			admin_email: 'dns-admin@test.com'
			serial: 2023120201
			refresh: 7200
			retry: 3600
			expire: 1209600
			minimum_ttl: 300
			is_active: true
		},
		SOARecord{
			zone_id: 2
			primary_ns: 'ns2.test.com'
			admin_email: 'backup-admin@test.com'
			serial: 2023120202
			refresh: 7200
			retry: 3600
			expire: 1209600
			minimum_ttl: 300
			is_active: false
		}
	]

	mut original_zone := dns_db.new(
		name: 'Encoding Test Zone'
		description: 'Testing encoding and decoding functionality'
		domain: 'test.com'
		dnsrecords: records
		administrators: [u32(10), 20, 30]
		status: .suspended
		min_signatures: 3
		metadata: {
			'zone_type': 'secondary'
			'provider': 'test_provider'
			'environment': 'staging'
			'auto_dnssec': 'true'
		}
		soarecord: soa_records
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_zone.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_zone := DNSZone{}
	dns_db.load(mut decoded_zone, mut decoder_obj)!

	// Verify all fields match after encoding/decoding
	assert decoded_zone.domain == original_zone.domain
	assert decoded_zone.status == original_zone.status
	assert decoded_zone.min_signatures == original_zone.min_signatures

	// Verify administrators
	assert decoded_zone.administrators.len == original_zone.administrators.len
	for i, admin in original_zone.administrators {
		assert decoded_zone.administrators[i] == admin
	}

	// Verify metadata map
	assert decoded_zone.metadata.len == original_zone.metadata.len
	for key, value in original_zone.metadata {
		assert decoded_zone.metadata[key] == value
	}

	// Verify DNS records
	assert decoded_zone.dnsrecords.len == original_zone.dnsrecords.len
	for i, record in original_zone.dnsrecords {
		decoded_record := decoded_zone.dnsrecords[i]
		assert decoded_record.subdomain == record.subdomain
		assert decoded_record.record_type == record.record_type
		assert decoded_record.value == record.value
		assert decoded_record.priority == record.priority
		assert decoded_record.ttl == record.ttl
		assert decoded_record.is_active == record.is_active
		assert decoded_record.cat == record.cat
		assert decoded_record.is_wildcard == record.is_wildcard
	}

	// Verify SOA records
	assert decoded_zone.soarecord.len == original_zone.soarecord.len
	for i, soa in original_zone.soarecord {
		decoded_soa := decoded_zone.soarecord[i]
		assert decoded_soa.zone_id == soa.zone_id
		assert decoded_soa.primary_ns == soa.primary_ns
		assert decoded_soa.admin_email == soa.admin_email
		assert decoded_soa.serial == soa.serial
		assert decoded_soa.refresh == soa.refresh
		assert decoded_soa.retry == soa.retry
		assert decoded_soa.expire == soa.expire
		assert decoded_soa.minimum_ttl == soa.minimum_ttl
		assert decoded_soa.is_active == soa.is_active
	}
}

fn test_dnszone_set_and_get() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Create simple DNS zone
	record := DNSRecord{
		subdomain: 'api'
		record_type: .a
		value: '192.168.1.100'
		priority: 0
		ttl: 3600
		is_active: true
		cat: .ipv4
		is_wildcard: false
	}

	mut dnszone := dns_db.new(
		name: 'DB Test Zone'
		description: 'Testing database operations'
		domain: 'dbtest.com'
		dnsrecords: [record]
		administrators: [u32(5)]
		status: .active
		min_signatures: 1
		metadata: {'test': 'true'}
		soarecord: []SOARecord{}
	)!

	// Save the DNS zone
	dnszone = dns_db.set(dnszone)!

	// Verify ID was assigned
	assert dnszone.id > 0
	original_id := dnszone.id

	// Retrieve the DNS zone
	retrieved_zone := dns_db.get(dnszone.id)!

	// Verify all fields match through the database roundtrip
	assert retrieved_zone.id == original_id
	assert retrieved_zone.name == 'DB Test Zone'
	assert retrieved_zone.description == 'Testing database operations'
	assert retrieved_zone.domain == 'dbtest.com'
	assert retrieved_zone.status == .active
	assert retrieved_zone.min_signatures == 1
	assert retrieved_zone.administrators.len == 1
	assert retrieved_zone.administrators[0] == 5
	assert retrieved_zone.metadata.len == 1
	assert retrieved_zone.metadata['test'] == 'true'
	assert retrieved_zone.dnsrecords.len == 1
	assert retrieved_zone.dnsrecords[0].subdomain == 'api'
	assert retrieved_zone.dnsrecords[0].value == '192.168.1.100'
}

fn test_dnszone_update() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Create and save a DNS zone
	mut dnszone := dns_db.new(
		name: 'Original Zone'
		description: 'Original description'
		domain: 'original.com'
		dnsrecords: []DNSRecord{}
		administrators: [u32(1)]
		status: .active
		min_signatures: 1
		metadata: {'version': '1.0'}
		soarecord: []SOARecord{}
	)!

	dnszone = dns_db.set(dnszone)!
	original_id := dnszone.id
	original_created_at := dnszone.created_at

	// Update the DNS zone
	new_record := DNSRecord{
		subdomain: 'updated'
		record_type: .a
		value: '10.0.0.1'
		priority: 0
		ttl: 300
		is_active: true
		cat: .ipv4
		is_wildcard: false
	}

	dnszone.name = 'Updated Zone'
	dnszone.description = 'Updated description'
	dnszone.domain = 'updated.com'
	dnszone.dnsrecords = [new_record]
	dnszone.status = .suspended
	dnszone.metadata = {'version': '2.0', 'updated': 'true'}
	dnszone.min_signatures = 2

	dnszone = dns_db.set(dnszone)!

	// Verify ID remains the same
	assert dnszone.id == original_id
	assert dnszone.created_at == original_created_at

	// Retrieve and verify updates
	updated_zone := dns_db.get(dnszone.id)!
	assert updated_zone.name == 'Updated Zone'
	assert updated_zone.description == 'Updated description'
	assert updated_zone.domain == 'updated.com'
	assert updated_zone.status == .suspended
	assert updated_zone.min_signatures == 2
	assert updated_zone.metadata.len == 2
	assert updated_zone.metadata['version'] == '2.0'
	assert updated_zone.dnsrecords.len == 1
	assert updated_zone.dnsrecords[0].subdomain == 'updated'
}

fn test_dnszone_exist_and_delete() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Test non-existent DNS zone
	exists := dns_db.exist(999)!
	assert exists == false

	// Create and save a DNS zone
	mut dnszone := dns_db.new(
		name: 'To Be Deleted'
		description: 'This DNS zone will be deleted'
		domain: 'delete.com'
		dnsrecords: []DNSRecord{}
		administrators: []u32{}
		status: .archived
		min_signatures: 0
		metadata: map[string]string{}
		soarecord: []SOARecord{}
	)!

	dnszone = dns_db.set(dnszone)!
	zone_id := dnszone.id

	// Test existing DNS zone
	exists_after_save := dns_db.exist(zone_id)!
	assert exists_after_save == true

	// Delete the DNS zone
	dns_db.delete(zone_id)!

	// Verify it no longer exists
	exists_after_delete := dns_db.exist(zone_id)!
	assert exists_after_delete == false

	// Verify get fails
	if _ := dns_db.get(zone_id) {
		panic('Should not be able to get deleted DNS zone')
	}
}

fn test_dnszone_list() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Initially should be empty
	initial_list := dns_db.list()!
	initial_count := initial_list.len

	// Create multiple DNS zones
	mut zone1 := dns_db.new(
		name: 'Zone 1'
		description: 'First zone'
		domain: 'zone1.com'
		dnsrecords: []DNSRecord{}
		administrators: [u32(1)]
		status: .active
		min_signatures: 1
		metadata: {'type': 'primary'}
		soarecord: []SOARecord{}
	)!

	mut zone2 := dns_db.new(
		name: 'Zone 2'
		description: 'Second zone'
		domain: 'zone2.net'
		dnsrecords: []DNSRecord{}
		administrators: [u32(1), 2]
		status: .suspended
		min_signatures: 2
		metadata: {'type': 'secondary'}
		soarecord: []SOARecord{}
	)!

	// Save both zones
	zone1 = dns_db.set(zone1)!
	zone2 = dns_db.set(zone2)!

	// List zones
	zone_list := dns_db.list()!

	// Should have 2 more zones than initially
	assert zone_list.len == initial_count + 2

	// Find our zones in the list
	mut found_zone1 := false
	mut found_zone2 := false

	for zone in zone_list {
		if zone.domain == 'zone1.com' {
			found_zone1 = true
			assert zone.status == .active
			assert zone.metadata['type'] == 'primary'
		}
		if zone.domain == 'zone2.net' {
			found_zone2 = true
			assert zone.status == .suspended
			assert zone.administrators.len == 2
		}
	}

	assert found_zone1 == true
	assert found_zone2 == true
}

fn test_dnszone_record_types() {
	mut mydb := setup_test_db()!
	mut dns_db := DBDNSZone{db: &mydb}

	// Test all DNS record types
	record_types := [NameType.a, .aaaa, .cname, .mx, .txt, .srv, .ptr, .ns]
	record_cats := [NameCat.ipv4, .ipv6, .mycelium]

	mut records := []DNSRecord{}
	for i, rtype in record_types {
		cat := record_cats[i % record_cats.len]
		records << DNSRecord{
			subdomain: 'test${i}'
			record_type: rtype
			value: 'value${i}'
			priority: u32(i)
			ttl: u32(300 + i * 100)
			is_active: i % 2 == 0
			cat: cat
			is_wildcard: i > 4
		}
	}

	mut zone := dns_db.new(
		name: 'Record Types Test'
		description: 'Testing all record types'
		domain: 'recordtest.com'
		dnsrecords: records
		administrators: []u32{}
		status: .active
		min_signatures: 0
		metadata: map[string]string{}
		soarecord: []SOARecord{}
	)!

	zone = dns_db.set(zone)!
	retrieved_zone := dns_db.get(zone.id)!

	assert retrieved_zone.dnsrecords.len == record_types.len
	for i, record in retrieved_zone.dnsrecords {
		assert record.record_type == record_types[i]
		assert record.cat == record_cats[i % record_cats.len]
		assert record.subdomain == 'test${i}'
		assert record.is_active == (i % 2 == 0)
		assert record.is_wildcard == (i > 4)
	}
}

489	lib/threefold/models_ledger/group_test.v	Normal file

@@ -0,0 +1,489 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_group_new() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Create test group with configuration
	config := GroupConfig{
		max_members: 100
		allow_guests: true
		auto_approve: false
		require_invite: true
	}

	mut group := group_db.new(
		name: 'Test Group'
		description: 'A test group for unit testing'
		group_name: 'developers'
		dnsrecords: [u32(1), 2, 3]
		administrators: [u32(10), 20]
		min_signatures: 2
		config: config
		status: .active
		visibility: .private
		created: 1234567890
		updated: 1234567891
	)!

	// Verify the group was created with correct values
	assert group.name == 'Test Group'
	assert group.description == 'A test group for unit testing'
	assert group.group_name == 'developers'
	assert group.dnsrecords.len == 3
	assert group.dnsrecords[0] == 1
	assert group.administrators.len == 2
	assert group.administrators[1] == 20
	assert group.min_signatures == 2
	assert group.config.max_members == 100
	assert group.config.allow_guests == true
	assert group.config.auto_approve == false
	assert group.config.require_invite == true
	assert group.status == .active
	assert group.visibility == .private
	assert group.created == 1234567890
	assert group.updated == 1234567891
	assert group.id == 0 // Should be 0 before saving
	assert group.updated_at > 0 // Should have timestamp
}

fn test_group_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Create a complex group
	config := GroupConfig{
		max_members: 500
		allow_guests: false
		auto_approve: true
		require_invite: false
	}

	mut original_group := group_db.new(
		name: 'Encoding Test Group'
		description: 'Testing encoding and decoding functionality'
		group_name: 'encoding_test_group'
		dnsrecords: [u32(100), 200, 300, 400, 500]
		administrators: [u32(1), 5, 10, 15, 20, 25]
		min_signatures: 3
		config: config
		status: .suspended
		visibility: .unlisted
		created: 1700000000
		updated: 1700000001
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_group.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_group := Group{}
	group_db.load(mut decoded_group, mut decoder_obj)!

	// Verify all fields match after encoding/decoding
	assert decoded_group.group_name == original_group.group_name
	assert decoded_group.min_signatures == original_group.min_signatures
	assert decoded_group.status == original_group.status
	assert decoded_group.visibility == original_group.visibility
	assert decoded_group.created == original_group.created
	assert decoded_group.updated == original_group.updated

	// Verify dnsrecords list
	assert decoded_group.dnsrecords.len == original_group.dnsrecords.len
	for i, record_id in original_group.dnsrecords {
		assert decoded_group.dnsrecords[i] == record_id
	}

	// Verify administrators list
	assert decoded_group.administrators.len == original_group.administrators.len
	for i, admin in original_group.administrators {
		assert decoded_group.administrators[i] == admin
	}

	// Verify config
	assert decoded_group.config.max_members == original_group.config.max_members
	assert decoded_group.config.allow_guests == original_group.config.allow_guests
	assert decoded_group.config.auto_approve == original_group.config.auto_approve
	assert decoded_group.config.require_invite == original_group.config.require_invite
}

fn test_group_set_and_get() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Create group
	config := GroupConfig{
		max_members: 50
		allow_guests: true
		auto_approve: true
		require_invite: false
	}

	mut group := group_db.new(
		name: 'DB Test Group'
		description: 'Testing database operations'
		group_name: 'db_test'
		dnsrecords: [u32(1)]
		administrators: [u32(5), 10]
		min_signatures: 1
		config: config
		status: .active
		visibility: .public
		created: 1234567890
		updated: 1234567890
	)!

	// Save the group
	group = group_db.set(group)!

	// Verify ID was assigned
	assert group.id > 0
	original_id := group.id

	// Retrieve the group
	retrieved_group := group_db.get(group.id)!

	// Verify all fields match through the database roundtrip
	assert retrieved_group.id == original_id
	assert retrieved_group.name == 'DB Test Group'
	assert retrieved_group.description == 'Testing database operations'
	assert retrieved_group.group_name == 'db_test'
	assert retrieved_group.status == .active
	assert retrieved_group.visibility == .public
	assert retrieved_group.min_signatures == 1
	assert retrieved_group.created == 1234567890
	assert retrieved_group.updated == 1234567890
	assert retrieved_group.dnsrecords.len == 1
	assert retrieved_group.dnsrecords[0] == 1
	assert retrieved_group.administrators.len == 2
	assert retrieved_group.administrators[0] == 5
	assert retrieved_group.administrators[1] == 10
	assert retrieved_group.config.max_members == 50
	assert retrieved_group.config.allow_guests == true
}

fn test_group_update() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Create and save a group
	config := GroupConfig{
		max_members: 25
		allow_guests: false
		auto_approve: false
		require_invite: true
	}

	mut group := group_db.new(
		name: 'Original Group'
		description: 'Original description'
		group_name: 'original'
		dnsrecords: []u32{}
		administrators: [u32(1)]
		min_signatures: 1
		config: config
		status: .active
		visibility: .private
		created: 1234567890
		updated: 1234567890
	)!

	group = group_db.set(group)!
	original_id := group.id
	original_created_at := group.created_at

	// Update the group
	new_config := GroupConfig{
		max_members: 200
		allow_guests: true
		auto_approve: true
		require_invite: false
	}

	group.name = 'Updated Group'
	group.description = 'Updated description'
	group.group_name = 'updated'
	group.dnsrecords = [u32(10), 20, 30]
	group.administrators = [u32(1), 2, 3, 4]
	group.min_signatures = 3
	group.config = new_config
	group.status = .inactive
	group.visibility = .public
	group.updated = 1234567999

	group = group_db.set(group)!

	// Verify ID remains the same
	assert group.id == original_id
	assert group.created_at == original_created_at

	// Retrieve and verify updates
	updated_group := group_db.get(group.id)!
	assert updated_group.name == 'Updated Group'
	assert updated_group.description == 'Updated description'
	assert updated_group.group_name == 'updated'
	assert updated_group.status == .inactive
	assert updated_group.visibility == .public
	assert updated_group.min_signatures == 3
	assert updated_group.updated == 1234567999
	assert updated_group.dnsrecords.len == 3
	assert updated_group.administrators.len == 4
	assert updated_group.config.max_members == 200
	assert updated_group.config.auto_approve == true
}

fn test_group_exist_and_delete() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Test non-existent group
	exists := group_db.exist(999)!
	assert exists == false

	// Create and save a group
	config := GroupConfig{
		max_members: 10
		allow_guests: false
		auto_approve: false
		require_invite: true
	}

	mut group := group_db.new(
		name: 'To Be Deleted'
		description: 'This group will be deleted'
		group_name: 'delete_me'
		dnsrecords: []u32{}
		administrators: []u32{}
		min_signatures: 0
		config: config
		status: .archived
		visibility: .private
		created: 1234567890
		updated: 1234567890
	)!

	group = group_db.set(group)!
	group_id := group.id

	// Test existing group
	exists_after_save := group_db.exist(group_id)!
	assert exists_after_save == true

	// Delete the group
	group_db.delete(group_id)!

	// Verify it no longer exists
	exists_after_delete := group_db.exist(group_id)!
	assert exists_after_delete == false

	// Verify get fails
	if _ := group_db.get(group_id) {
		panic('Should not be able to get deleted group')
	}
}

fn test_group_list() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Initially should be empty
	initial_list := group_db.list()!
	initial_count := initial_list.len

	// Create multiple groups
	config1 := GroupConfig{
		max_members: 100
		allow_guests: true
		auto_approve: true
		require_invite: false
	}

	config2 := GroupConfig{
		max_members: 50
		allow_guests: false
		auto_approve: false
		require_invite: true
	}

	mut group1 := group_db.new(
		name: 'Group 1'
		description: 'First group'
		group_name: 'group1'
		dnsrecords: [u32(1)]
		administrators: [u32(1)]
		min_signatures: 1
		config: config1
		status: .active
		visibility: .public
		created: 1234567890
		updated: 1234567890
	)!

	mut group2 := group_db.new(
		name: 'Group 2'
		description: 'Second group'
		group_name: 'group2'
		dnsrecords: [u32(1), 2]
		administrators: [u32(1), 2]
		min_signatures: 2
		config: config2
		status: .inactive
		visibility: .private
		created: 1234567891
		updated: 1234567891
	)!

	// Save both groups
	group1 = group_db.set(group1)!
	group2 = group_db.set(group2)!

	// List groups
	group_list := group_db.list()!

	// Should have 2 more groups than initially
	assert group_list.len == initial_count + 2

	// Find our groups in the list
	mut found_group1 := false
	mut found_group2 := false

	for grp in group_list {
		if grp.group_name == 'group1' {
			found_group1 = true
			assert grp.status == .active
			assert grp.visibility == .public
			assert grp.config.max_members == 100
		}
		if grp.group_name == 'group2' {
			found_group2 = true
			assert grp.status == .inactive
			assert grp.visibility == .private
			assert grp.config.require_invite == true
		}
	}

	assert found_group1 == true
	assert found_group2 == true
}

fn test_group_status_and_visibility() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Test all status values
	statuses := [GroupStatus.active, .inactive, .suspended, .archived]
	visibilities := [Visibility.public, .private, .unlisted]

	for i, status in statuses {
		visibility := visibilities[i % visibilities.len]

		config := GroupConfig{
			max_members: u32(10 + i)
			allow_guests: i % 2 == 0
			auto_approve: i % 2 == 1
			require_invite: i % 3 == 0
		}

		mut group := group_db.new(
			name: 'Status Test Group ${i}'
			description: 'Testing ${status} and ${visibility}'
			group_name: 'status_test_${i}'
			dnsrecords: []u32{}
			administrators: []u32{}
			min_signatures: 0
			config: config
			status: status
			visibility: visibility
			created: u64(1234567890 + i)
			updated: u64(1234567890 + i)
		)!

		group = group_db.set(group)!
		retrieved_group := group_db.get(group.id)!

		assert retrieved_group.status == status
		assert retrieved_group.visibility == visibility
		assert retrieved_group.config.max_members == u32(10 + i)
	}
}

fn test_group_edge_cases() {
	mut mydb := setup_test_db()!
	mut group_db := DBGroup{db: &mydb}

	// Test minimal group
	minimal_config := GroupConfig{
		max_members: 0
		allow_guests: false
		auto_approve: false
		require_invite: false
	}

	mut minimal_group := group_db.new(
		name: ''
		description: ''
		group_name: ''
		dnsrecords: []u32{}
		administrators: []u32{}
		min_signatures: 0
		config: minimal_config
		status: .active
		visibility: .public
		created: 0
		updated: 0
	)!

	minimal_group = group_db.set(minimal_group)!
	retrieved_minimal := group_db.get(minimal_group.id)!

	assert retrieved_minimal.name == ''
	assert retrieved_minimal.description == ''
	assert retrieved_minimal.group_name == ''
	assert retrieved_minimal.dnsrecords.len == 0
	assert retrieved_minimal.administrators.len == 0
|
||||
assert retrieved_minimal.config.max_members == 0
|
||||
|
||||
// Test group with large arrays
|
||||
large_dns_records := []u32{len: 1000, init: u32(index + 1)}
|
||||
large_administrators := []u32{len: 100, init: u32(index + 1000)}
|
||||
|
||||
large_config := GroupConfig{
|
||||
max_members: 99999
|
||||
allow_guests: true
|
||||
auto_approve: true
|
||||
require_invite: true
|
||||
}
|
||||
|
||||
mut large_group := group_db.new(
|
||||
name: 'Large Group'
|
||||
description: 'Group with large arrays'
|
||||
group_name: 'large_group'
|
||||
dnsrecords: large_dns_records
|
||||
administrators: large_administrators
|
||||
min_signatures: 50
|
||||
config: large_config
|
||||
status: .active
|
||||
visibility: .public
|
||||
created: 1234567890
|
||||
updated: 1234567890
|
||||
)!
|
||||
|
||||
large_group = group_db.set(large_group)!
|
||||
retrieved_large := group_db.get(large_group.id)!
|
||||
|
||||
assert retrieved_large.dnsrecords.len == 1000
|
||||
assert retrieved_large.administrators.len == 100
|
||||
assert retrieved_large.dnsrecords[0] == 1
|
||||
assert retrieved_large.dnsrecords[999] == 1000
|
||||
assert retrieved_large.administrators[0] == 1000
|
||||
assert retrieved_large.administrators[99] == 1099
|
||||
assert retrieved_large.config.max_members == 99999
|
||||
}
|
||||
179 lib/threefold/models_ledger/member_test.v Normal file
@@ -0,0 +1,179 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_member_new() {
	mut mydb := setup_test_db()!
	mut member_db := DBMember{db: &mydb}

	mut member := member_db.new(
		name: 'Test Member'
		description: 'A test member for unit testing'
		group_id: 1
		user_id: 10
		role: .admin
		status: .active
	)!

	assert member.name == 'Test Member'
	assert member.description == 'A test member for unit testing'
	assert member.group_id == 1
	assert member.user_id == 10
	assert member.role == .admin
	assert member.status == .active
	assert member.join_date > 0
	assert member.last_activity > 0
	assert member.id == 0
	assert member.updated_at > 0
}

fn test_member_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut member_db := DBMember{db: &mydb}

	mut original_member := member_db.new(
		name: 'Encoding Test Member'
		description: 'Testing encoding and decoding'
		group_id: 999
		user_id: 888
		role: .moderator
		status: .suspended
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_member.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_member := Member{}
	member_db.load(mut decoded_member, mut decoder_obj)!

	// Verify all fields match
	assert decoded_member.group_id == original_member.group_id
	assert decoded_member.user_id == original_member.user_id
	assert decoded_member.role == original_member.role
	assert decoded_member.status == original_member.status
	assert decoded_member.join_date == original_member.join_date
	assert decoded_member.last_activity == original_member.last_activity
}

fn test_member_crud_operations() {
	mut mydb := setup_test_db()!
	mut member_db := DBMember{db: &mydb}

	// Create and save
	mut member := member_db.new(
		name: 'CRUD Test Member'
		description: 'Testing CRUD operations'
		group_id: 5
		user_id: 15
		role: .member
		status: .pending
	)!

	member = member_db.set(member)!
	assert member.id > 0
	member_id := member.id

	// Get
	retrieved := member_db.get(member_id)!
	assert retrieved.group_id == 5
	assert retrieved.user_id == 15
	assert retrieved.role == .member
	assert retrieved.status == .pending

	// Update
	member.role = .owner
	member.status = .active
	member = member_db.set(member)!

	updated := member_db.get(member_id)!
	assert updated.role == .owner
	assert updated.status == .active

	// Exist
	exists := member_db.exist(member_id)!
	assert exists == true

	// Delete
	member_db.delete(member_id)!
	exists_after := member_db.exist(member_id)!
	assert exists_after == false
}

fn test_member_list() {
	mut mydb := setup_test_db()!
	mut member_db := DBMember{db: &mydb}

	initial_list := member_db.list()!
	initial_count := initial_list.len

	// Create multiple members
	mut member1 := member_db.new(
		name: 'Member 1'
		description: 'First member'
		group_id: 1
		user_id: 1
		role: .admin
		status: .active
	)!

	mut member2 := member_db.new(
		name: 'Member 2'
		description: 'Second member'
		group_id: 2
		user_id: 2
		role: .member
		status: .pending
	)!

	member1 = member_db.set(member1)!
	member2 = member_db.set(member2)!

	member_list := member_db.list()!
	assert member_list.len == initial_count + 2

	mut found1 := false
	mut found2 := false
	for m in member_list {
		if m.name == 'Member 1' {
			found1 = true
			assert m.role == .admin
		}
		if m.name == 'Member 2' {
			found2 = true
			assert m.status == .pending
		}
	}
	assert found1 && found2
}

fn test_member_roles_and_statuses() {
	mut mydb := setup_test_db()!
	mut member_db := DBMember{db: &mydb}

	roles := [MemberRole.member, .moderator, .admin, .owner]
	statuses := [MemberStatus.pending, .active, .suspended, .archived]

	for i, role in roles {
		status := statuses[i % statuses.len]
		mut member := member_db.new(
			name: 'Role Test ${i}'
			description: 'Testing ${role} with ${status}'
			group_id: u32(i + 1)
			user_id: u32(i + 100)
			role: role
			status: status
		)!

		member = member_db.set(member)!
		retrieved := member_db.get(member.id)!
		assert retrieved.role == role
		assert retrieved.status == status
	}
}
224 lib/threefold/models_ledger/models_test.v Normal file
@@ -0,0 +1,224 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db

// Test that all models can be created and their basic functionality works
fn test_all_models_integration() {
	mut mydb := setup_test_db()!

	// Initialize all model DBs
	mut account_db := DBAccount{db: &mydb}
	mut asset_db := DBAsset{db: &mydb}
	mut user_db := DBUser{db: &mydb}
	mut transaction_db := DBTransaction{db: &mydb}
	mut dnszone_db := DBDNSZone{db: &mydb}
	mut group_db := DBGroup{db: &mydb}
	mut member_db := DBMember{db: &mydb}
	mut notary_db := DBNotary{db: &mydb}
	mut signature_db := DBSignature{db: &mydb}
	mut userkvs_db := DBUserKVS{db: &mydb}
	mut userkvsitem_db := DBUserKVSItem{db: &mydb}

	// Create one instance of each model to ensure they all work
	mut user := user_db.new(
		name: 'Integration Test User'
		description: 'User for integration testing'
		username: 'integrationuser'
		pubkey: 'ed25519_INTEGRATION_TEST'
		email: ['integration@test.com']
		status: .active
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!
	user = user_db.set(user)!

	mut asset := asset_db.new(
		name: 'Integration Test Asset'
		description: 'Asset for integration testing'
		address: 'GINTEGRATION...TEST'
		asset_type: 'token'
		issuer: user.id
		supply: 1000000.0
		decimals: 8
		is_frozen: false
		metadata: {'test': 'integration'}
		administrators: [user.id]
		min_signatures: 1
	)!
	asset = asset_db.set(asset)!

	mut account := account_db.new(
		name: 'Integration Test Account'
		description: 'Account for integration testing'
		owner_id: user.id
		location_id: 0
		accountpolicies: []AccountPolicyArg{}
		assets: []AccountAsset{}
		assetid: asset.id
		last_activity: 1234567890
		administrators: [user.id]
	)!
	account = account_db.set(account)!

	mut transaction := transaction_db.new(
		name: 'Integration Test Transaction'
		description: 'Transaction for integration testing'
		txid: 12345
		source: account.id
		destination: account.id
		assetid: asset.id
		amount: 100.0
		timestamp: 1234567890
		status: 'pending'
		memo: 'Integration test'
		tx_type: .transfer
		signatures: []TransactionSignature{}
	)!
	transaction = transaction_db.set(transaction)!

	mut dnszone := dnszone_db.new(
		name: 'Integration Test DNS Zone'
		description: 'DNS zone for integration testing'
		domain: 'integration.test'
		dnsrecords: []DNSRecord{}
		administrators: [user.id]
		status: .active
		min_signatures: 1
		metadata: {'test': 'integration'}
		soarecord: []SOARecord{}
	)!
	dnszone = dnszone_db.set(dnszone)!

	mut group := group_db.new(
		name: 'Integration Test Group'
		description: 'Group for integration testing'
		group_name: 'integration_group'
		dnsrecords: [dnszone.id]
		administrators: [user.id]
		min_signatures: 1
		config: GroupConfig{
			max_members: 100
			allow_guests: true
			auto_approve: true
			require_invite: false
		}
		status: .active
		visibility: .public
		created: 1234567890
		updated: 1234567890
	)!
	group = group_db.set(group)!

	mut member := member_db.new(
		name: 'Integration Test Member'
		description: 'Member for integration testing'
		group_id: group.id
		user_id: user.id
		role: .admin
		status: .active
	)!
	member = member_db.set(member)!

	mut notary := notary_db.new(
		name: 'Integration Test Notary'
		description: 'Notary for integration testing'
		notary_id: 1
		pubkey: 'ed25519_NOTARY_INTEGRATION'
		address: 'TFT_INTEGRATION_NOTARY'
		is_active: true
	)!
	notary = notary_db.set(notary)!

	mut signature := signature_db.new(
		name: 'Integration Test Signature'
		description: 'Signature for integration testing'
		signer_id: user.id
		tx_id: transaction.id
		signature: 'integration_signature_hex'
	)!
	signature = signature_db.set(signature)!

	mut userkvs := userkvs_db.new(
		name: 'Integration Test KVS'
		description: 'KVS for integration testing'
		user_id: user.id
	)!
	userkvs = userkvs_db.set(userkvs)!

	mut userkvsitem := userkvsitem_db.new(
		name: 'Integration Test KVS Item'
		description: 'KVS item for integration testing'
		kvs_id: userkvs.id
		key: 'integration_key'
		value: 'integration_value'
	)!
	userkvsitem = userkvsitem_db.set(userkvsitem)!

	// Verify all objects were created successfully
	assert user.id > 0
	assert asset.id > 0
	assert account.id > 0
	assert transaction.id > 0
	assert dnszone.id > 0
	assert group.id > 0
	assert member.id > 0
	assert notary.id > 0
	assert signature.id > 0
	assert userkvs.id > 0
	assert userkvsitem.id > 0

	// Verify relationships
	assert account.owner_id == user.id
	assert transaction.source == account.id
	assert transaction.assetid == asset.id
	assert member.group_id == group.id
	assert member.user_id == user.id
	assert signature.signer_id == user.id
	assert signature.tx_id == transaction.id
	assert userkvs.user_id == user.id
	assert userkvsitem.kvs_id == userkvs.id

	println('✅ All models integration test passed!')
}

fn test_encoding_decoding_performance() {
	mut mydb := setup_test_db()!

	// Test encoding/decoding performance with a complex object
	mut user_db := DBUser{db: &mydb}

	// Create a user with large encrypted data
	large_data := []u8{len: 10000, init: u8(index % 256)}
	large_nonce := []u8{len: 12, init: u8(index + 100)}

	large_secrets := []SecretBox{len: 10, init: SecretBox{
		data: large_data
		nonce: large_nonce
	}}

	mut user := user_db.new(
		name: 'Performance Test User'
		description: 'User for performance testing'
		username: 'perfuser'
		pubkey: 'ed25519_PERFORMANCE_TEST'
		email: ['perf@test.com', 'perf2@test.com', 'perf3@test.com']
		status: .active
		userprofile: large_secrets
		kyc: large_secrets
	)!

	// Save and retrieve to test full encoding/decoding cycle
	user = user_db.set(user)!
	retrieved_user := user_db.get(user.id)!

	// Verify the large data was preserved
	assert retrieved_user.userprofile.len == 10
	assert retrieved_user.kyc.len == 10
	assert retrieved_user.userprofile[0].data.len == 10000
	assert retrieved_user.userprofile[0].data[0] == 0
	assert retrieved_user.userprofile[0].data[9999] == 15 // 9999 % 256

	println('✅ Encoding/decoding performance test passed!')
}
195 lib/threefold/models_ledger/notary_test.v Normal file
@@ -0,0 +1,195 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_notary_new() {
	mut mydb := setup_test_db()!
	mut notary_db := DBNotary{db: &mydb}

	mut notary := notary_db.new(
		name: 'Test Notary'
		description: 'A test notary for unit testing'
		notary_id: 1
		pubkey: 'ed25519_ABCD1234567890EFGH'
		address: 'TFT_ADDRESS_XYZ123'
		is_active: true
	)!

	assert notary.name == 'Test Notary'
	assert notary.description == 'A test notary for unit testing'
	assert notary.notary_id == 1
	assert notary.pubkey == 'ed25519_ABCD1234567890EFGH'
	assert notary.address == 'TFT_ADDRESS_XYZ123'
	assert notary.is_active == true
	assert notary.id == 0
	assert notary.updated_at > 0
}

fn test_notary_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut notary_db := DBNotary{db: &mydb}

	mut original_notary := notary_db.new(
		name: 'Encoding Test Notary'
		description: 'Testing encoding and decoding'
		notary_id: 999
		pubkey: 'ed25519_ENCODING_TEST_PUBKEY_123456789'
		address: 'TFT_ENCODING_ADDRESS_ABCDEF'
		is_active: false
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_notary.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_notary := Notary{}
	notary_db.load(mut decoded_notary, mut decoder_obj)!

	// Verify all fields match
	assert decoded_notary.notary_id == original_notary.notary_id
	assert decoded_notary.pubkey == original_notary.pubkey
	assert decoded_notary.address == original_notary.address
	assert decoded_notary.is_active == original_notary.is_active
}

fn test_notary_crud_operations() {
	mut mydb := setup_test_db()!
	mut notary_db := DBNotary{db: &mydb}

	// Create and save
	mut notary := notary_db.new(
		name: 'CRUD Test Notary'
		description: 'Testing CRUD operations'
		notary_id: 5
		pubkey: 'ed25519_CRUD_TEST_PUBKEY'
		address: 'TFT_CRUD_ADDRESS'
		is_active: true
	)!

	notary = notary_db.set(notary)!
	assert notary.id > 0
	notary_id := notary.id

	// Get
	retrieved := notary_db.get(notary_id)!
	assert retrieved.notary_id == 5
	assert retrieved.pubkey == 'ed25519_CRUD_TEST_PUBKEY'
	assert retrieved.address == 'TFT_CRUD_ADDRESS'
	assert retrieved.is_active == true

	// Update
	notary.is_active = false
	notary.address = 'TFT_UPDATED_ADDRESS'
	notary = notary_db.set(notary)!

	updated := notary_db.get(notary_id)!
	assert updated.is_active == false
	assert updated.address == 'TFT_UPDATED_ADDRESS'

	// Exist
	exists := notary_db.exist(notary_id)!
	assert exists == true

	// Delete
	notary_db.delete(notary_id)!
	exists_after := notary_db.exist(notary_id)!
	assert exists_after == false
}

fn test_notary_list() {
	mut mydb := setup_test_db()!
	mut notary_db := DBNotary{db: &mydb}

	initial_list := notary_db.list()!
	initial_count := initial_list.len

	// Create multiple notaries
	mut notary1 := notary_db.new(
		name: 'Notary 1'
		description: 'First notary'
		notary_id: 1
		pubkey: 'ed25519_NOTARY1_PUBKEY'
		address: 'TFT_NOTARY1_ADDRESS'
		is_active: true
	)!

	mut notary2 := notary_db.new(
		name: 'Notary 2'
		description: 'Second notary'
		notary_id: 2
		pubkey: 'ed25519_NOTARY2_PUBKEY'
		address: 'TFT_NOTARY2_ADDRESS'
		is_active: false
	)!

	notary1 = notary_db.set(notary1)!
	notary2 = notary_db.set(notary2)!

	notary_list := notary_db.list()!
	assert notary_list.len == initial_count + 2

	mut found1 := false
	mut found2 := false
	for n in notary_list {
		if n.notary_id == 1 {
			found1 = true
			assert n.is_active == true
		}
		if n.notary_id == 2 {
			found2 = true
			assert n.is_active == false
		}
	}
	assert found1 && found2
}

fn test_notary_edge_cases() {
	mut mydb := setup_test_db()!
	mut notary_db := DBNotary{db: &mydb}

	// Test empty strings
	mut minimal_notary := notary_db.new(
		name: ''
		description: ''
		notary_id: 0
		pubkey: ''
		address: ''
		is_active: false
	)!

	minimal_notary = notary_db.set(minimal_notary)!
	retrieved := notary_db.get(minimal_notary.id)!

	assert retrieved.name == ''
	assert retrieved.description == ''
	assert retrieved.notary_id == 0
	assert retrieved.pubkey == ''
	assert retrieved.address == ''
	assert retrieved.is_active == false

	// Test long strings
	long_pubkey := 'ed25519_' + 'A'.repeat(1000)
	long_address := 'TFT_' + 'B'.repeat(1000)

	mut long_notary := notary_db.new(
		name: 'Long String Notary'
		description: 'Testing with very long strings'
		notary_id: 999999
		pubkey: long_pubkey
		address: long_address
		is_active: true
	)!

	long_notary = notary_db.set(long_notary)!
	retrieved_long := notary_db.get(long_notary.id)!

	assert retrieved_long.pubkey == long_pubkey
	assert retrieved_long.address == long_address
	assert retrieved_long.notary_id == 999999
}
183 lib/threefold/models_ledger/signature_test.v Normal file
@@ -0,0 +1,183 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_signature_new() {
	mut mydb := setup_test_db()!
	mut sig_db := DBSignature{db: &mydb}

	mut signature := sig_db.new(
		name: 'Test Signature'
		description: 'A test signature for unit testing'
		signer_id: 1
		tx_id: 123
		signature: 'abcd1234567890efgh'
	)!

	assert signature.name == 'Test Signature'
	assert signature.description == 'A test signature for unit testing'
	assert signature.signer_id == 1
	assert signature.tx_id == 123
	assert signature.signature == 'abcd1234567890efgh'
	assert signature.timestamp > 0
	assert signature.id == 0
	assert signature.updated_at > 0
}

fn test_signature_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut sig_db := DBSignature{db: &mydb}

	mut original_sig := sig_db.new(
		name: 'Encoding Test Signature'
		description: 'Testing encoding and decoding'
		signer_id: 999
		tx_id: 888
		signature: 'hex_encoded_signature_123456789abcdef'
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_sig.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_sig := Signature{}
	sig_db.load(mut decoded_sig, mut decoder_obj)!

	// Verify all fields match
	assert decoded_sig.signer_id == original_sig.signer_id
	assert decoded_sig.tx_id == original_sig.tx_id
	assert decoded_sig.signature == original_sig.signature
	assert decoded_sig.timestamp == original_sig.timestamp
}

fn test_signature_crud_operations() {
	mut mydb := setup_test_db()!
	mut sig_db := DBSignature{db: &mydb}

	// Create and save
	mut signature := sig_db.new(
		name: 'CRUD Test Signature'
		description: 'Testing CRUD operations'
		signer_id: 5
		tx_id: 15
		signature: 'crud_test_signature_hex'
	)!

	signature = sig_db.set(signature)!
	assert signature.id > 0
	sig_id := signature.id

	// Get
	retrieved := sig_db.get(sig_id)!
	assert retrieved.signer_id == 5
	assert retrieved.tx_id == 15
	assert retrieved.signature == 'crud_test_signature_hex'

	// Update
	signature.signature = 'updated_signature_hex'
	signature = sig_db.set(signature)!

	updated := sig_db.get(sig_id)!
	assert updated.signature == 'updated_signature_hex'

	// Exist
	exists := sig_db.exist(sig_id)!
	assert exists == true

	// Delete
	sig_db.delete(sig_id)!
	exists_after := sig_db.exist(sig_id)!
	assert exists_after == false
}

fn test_signature_list() {
	mut mydb := setup_test_db()!
	mut sig_db := DBSignature{db: &mydb}

	initial_list := sig_db.list()!
	initial_count := initial_list.len

	// Create multiple signatures
	mut sig1 := sig_db.new(
		name: 'Signature 1'
		description: 'First signature'
		signer_id: 1
		tx_id: 101
		signature: 'signature1_hex'
	)!

	mut sig2 := sig_db.new(
		name: 'Signature 2'
		description: 'Second signature'
		signer_id: 2
		tx_id: 102
		signature: 'signature2_hex'
	)!

	sig1 = sig_db.set(sig1)!
	sig2 = sig_db.set(sig2)!

	sig_list := sig_db.list()!
	assert sig_list.len == initial_count + 2

	mut found1 := false
	mut found2 := false
	for s in sig_list {
		if s.signer_id == 1 {
			found1 = true
			assert s.tx_id == 101
		}
		if s.signer_id == 2 {
			found2 = true
			assert s.tx_id == 102
		}
	}
	assert found1 && found2
}

fn test_signature_edge_cases() {
	mut mydb := setup_test_db()!
	mut sig_db := DBSignature{db: &mydb}

	// Test minimal signature
	mut minimal_sig := sig_db.new(
		name: ''
		description: ''
		signer_id: 0
		tx_id: 0
		signature: ''
	)!

	minimal_sig = sig_db.set(minimal_sig)!
	retrieved := sig_db.get(minimal_sig.id)!

	assert retrieved.name == ''
	assert retrieved.description == ''
	assert retrieved.signer_id == 0
	assert retrieved.tx_id == 0
	assert retrieved.signature == ''

	// Test very long signature
	long_signature := 'A'.repeat(10000)

	mut long_sig := sig_db.new(
		name: 'Long Signature'
		description: 'Testing with very long signature'
		signer_id: 999999
		tx_id: 888888
		signature: long_signature
	)!

	long_sig = sig_db.set(long_sig)!
	retrieved_long := sig_db.get(long_sig.id)!

	assert retrieved_long.signature == long_signature
	assert retrieved_long.signer_id == 999999
	assert retrieved_long.tx_id == 888888
}
7 lib/threefold/models_ledger/test_utils.v Normal file
@@ -0,0 +1,7 @@
module models_ledger

import freeflowuniverse.herolib.hero.db

fn setup_test_db() !db.DB {
	return db.new(path: ':memory:')!
}
439 lib/threefold/models_ledger/transaction_test.v Normal file
@@ -0,0 +1,439 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_transaction_new() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Create test transaction with signatures
	sig1 := TransactionSignature{
		signer_id: 1
		signature: 'sig1_abcd123'
		timestamp: 1234567890
	}

	sig2 := TransactionSignature{
		signer_id: 2
		signature: 'sig2_efgh456'
		timestamp: 1234567891
	}

	mut transaction := tx_db.new(
		name: 'Test Transaction'
		description: 'A test transaction for unit testing'
		txid: 12345
		source: 100
		destination: 200
		assetid: 1
		amount: 500.75
		timestamp: 1234567890
		status: 'pending'
		memo: 'Test transfer'
		tx_type: .transfer
		signatures: [sig1, sig2]
	)!

	// Verify the transaction was created with correct values
	assert transaction.name == 'Test Transaction'
	assert transaction.description == 'A test transaction for unit testing'
	assert transaction.txid == 12345
	assert transaction.source == 100
	assert transaction.destination == 200
	assert transaction.assetid == 1
	assert transaction.amount == 500.75
	assert transaction.timestamp == 1234567890
	assert transaction.status == 'pending'
	assert transaction.memo == 'Test transfer'
	assert transaction.tx_type == .transfer
	assert transaction.signatures.len == 2
	assert transaction.signatures[0].signer_id == 1
	assert transaction.signatures[1].signature == 'sig2_efgh456'
	assert transaction.id == 0 // Should be 0 before saving
	assert transaction.updated_at > 0 // Should have timestamp
}

fn test_transaction_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Create a complex transaction with multiple signatures
	sigs := [
		TransactionSignature{
			signer_id: 10
			signature: 'complex_sig_1_abcdef123456'
			timestamp: 1234567800
		},
		TransactionSignature{
			signer_id: 20
			signature: 'complex_sig_2_ghijkl789012'
			timestamp: 1234567801
		},
		TransactionSignature{
			signer_id: 30
			signature: 'complex_sig_3_mnopqr345678'
			timestamp: 1234567802
		}
	]

	mut original_tx := tx_db.new(
		name: 'Encoding Test Transaction'
		description: 'Testing encoding and decoding functionality'
		txid: 99999
		source: 999
		destination: 888
		assetid: 5
		amount: 12345.6789
		timestamp: 1234567890
		status: 'completed'
		memo: 'Complex transaction for encoding test with special chars: !@#$%^&*()'
		tx_type: .clawback
		signatures: sigs
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_tx.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_tx := Transaction{}
	tx_db.load(mut decoded_tx, mut decoder_obj)!

	// Verify all fields match after encoding/decoding
	assert decoded_tx.txid == original_tx.txid
	assert decoded_tx.source == original_tx.source
	assert decoded_tx.destination == original_tx.destination
	assert decoded_tx.assetid == original_tx.assetid
	assert decoded_tx.amount == original_tx.amount
	assert decoded_tx.timestamp == original_tx.timestamp
	assert decoded_tx.status == original_tx.status
	assert decoded_tx.memo == original_tx.memo
	assert decoded_tx.tx_type == original_tx.tx_type

	// Verify signatures array
	assert decoded_tx.signatures.len == original_tx.signatures.len
	for i, sig in original_tx.signatures {
		assert decoded_tx.signatures[i].signer_id == sig.signer_id
		assert decoded_tx.signatures[i].signature == sig.signature
		assert decoded_tx.signatures[i].timestamp == sig.timestamp
	}
}

fn test_transaction_set_and_get() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Create transaction
	sig := TransactionSignature{
		signer_id: 5
		signature: 'db_test_sig_123'
		timestamp: 1234567890
	}

	mut transaction := tx_db.new(
		name: 'DB Test Transaction'
		description: 'Testing database operations'
		txid: 555
		source: 111
		destination: 222
		assetid: 3
		amount: 100.50
		timestamp: 1234567890
		status: 'confirmed'
		memo: 'Database test transfer'
		tx_type: .transfer
		signatures: [sig]
	)!

	// Save the transaction
	transaction = tx_db.set(transaction)!

	// Verify ID was assigned
	assert transaction.id > 0
	original_id := transaction.id

	// Retrieve the transaction
	retrieved_tx := tx_db.get(transaction.id)!

	// Verify all fields match through the database roundtrip
	assert retrieved_tx.id == original_id
	assert retrieved_tx.name == 'DB Test Transaction'
	assert retrieved_tx.description == 'Testing database operations'
	assert retrieved_tx.txid == 555
	assert retrieved_tx.source == 111
	assert retrieved_tx.destination == 222
	assert retrieved_tx.assetid == 3
	assert retrieved_tx.amount == 100.50
	assert retrieved_tx.timestamp == 1234567890
	assert retrieved_tx.status == 'confirmed'
	assert retrieved_tx.memo == 'Database test transfer'
	assert retrieved_tx.tx_type == .transfer
	assert retrieved_tx.signatures.len == 1
	assert retrieved_tx.signatures[0].signer_id == 5
	assert retrieved_tx.signatures[0].signature == 'db_test_sig_123'
}

fn test_transaction_update() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Create and save a transaction
	mut transaction := tx_db.new(
		name: 'Original Transaction'
		description: 'Original description'
		txid: 777
		source: 100
		destination: 200
		assetid: 1
		amount: 50.0
		timestamp: 1234567890
		status: 'pending'
		memo: 'Original memo'
		tx_type: .transfer
		signatures: []TransactionSignature{}
	)!

	transaction = tx_db.set(transaction)!
	original_id := transaction.id
	original_created_at := transaction.created_at

	// Update the transaction
	new_sig := TransactionSignature{
		signer_id: 999
		signature: 'updated_signature_xyz'
		timestamp: 1234567999
	}

	transaction.name = 'Updated Transaction'
	transaction.description = 'Updated description'
	transaction.status = 'completed'
	transaction.memo = 'Updated memo'
	transaction.tx_type = .issue
	transaction.signatures = [new_sig]

	transaction = tx_db.set(transaction)!

	// Verify ID remains the same
	assert transaction.id == original_id
	assert transaction.created_at == original_created_at

	// Retrieve and verify updates
	updated_tx := tx_db.get(transaction.id)!
	assert updated_tx.name == 'Updated Transaction'
	assert updated_tx.description == 'Updated description'
	assert updated_tx.status == 'completed'
	assert updated_tx.memo == 'Updated memo'
	assert updated_tx.tx_type == .issue
	assert updated_tx.signatures.len == 1
	assert updated_tx.signatures[0].signer_id == 999
}

fn test_transaction_exist_and_delete() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Test non-existent transaction
	exists := tx_db.exist(999)!
	assert exists == false

	// Create and save a transaction
	mut transaction := tx_db.new(
		name: 'To Be Deleted'
		description: 'This transaction will be deleted'
		txid: 666
		source: 100
		destination: 200
		assetid: 1
		amount: 1.0
		timestamp: 1234567890
		status: 'failed'
		memo: 'Delete me'
		tx_type: .burn
		signatures: []TransactionSignature{}
	)!

	transaction = tx_db.set(transaction)!
	tx_id := transaction.id

	// Test existing transaction
	exists_after_save := tx_db.exist(tx_id)!
	assert exists_after_save == true

	// Delete the transaction
	tx_db.delete(tx_id)!

	// Verify it no longer exists
	exists_after_delete := tx_db.exist(tx_id)!
	assert exists_after_delete == false

	// Verify get fails
	if _ := tx_db.get(tx_id) {
		panic('Should not be able to get deleted transaction')
	}
}

fn test_transaction_list() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Initially should be empty
	initial_list := tx_db.list()!
	initial_count := initial_list.len

	// Create multiple transactions
	mut tx1 := tx_db.new(
		name: 'Transaction 1'
		description: 'First transaction'
		txid: 1001
		source: 1
		destination: 2
		assetid: 1
		amount: 100.0
		timestamp: 1234567890
		status: 'completed'
		memo: 'First'
		tx_type: .transfer
		signatures: []TransactionSignature{}
	)!

	mut tx2 := tx_db.new(
		name: 'Transaction 2'
		description: 'Second transaction'
		txid: 1002
		source: 2
		destination: 3
		assetid: 2
		amount: 200.0
		timestamp: 1234567891
		status: 'pending'
		memo: 'Second'
		tx_type: .freeze
		signatures: []TransactionSignature{}
	)!

	// Save both transactions
	tx1 = tx_db.set(tx1)!
	tx2 = tx_db.set(tx2)!

	// List transactions
	tx_list := tx_db.list()!

	// Should have 2 more transactions than initially
	assert tx_list.len == initial_count + 2

	// Find our transactions in the list
	mut found_tx1 := false
	mut found_tx2 := false

	for tx in tx_list {
		if tx.txid == 1001 {
			found_tx1 = true
			assert tx.status == 'completed'
			assert tx.tx_type == .transfer
		}
		if tx.txid == 1002 {
			found_tx2 = true
			assert tx.status == 'pending'
			assert tx.tx_type == .freeze
		}
	}

	assert found_tx1 == true
	assert found_tx2 == true
}

fn test_transaction_types() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Test all transaction types
	tx_types := [TransactionType.transfer, .clawback, .freeze, .unfreeze, .issue, .burn]

	for i, tx_type in tx_types {
		mut tx := tx_db.new(
			name: 'Transaction Type Test ${i}'
			description: 'Testing ${tx_type}'
			txid: u32(2000 + i)
			source: 100
			destination: 200
			assetid: 1
			amount: f64(i + 1) * 10.0
			timestamp: 1234567890
			status: 'completed'
			memo: 'Type test for ${tx_type}'
			tx_type: tx_type
			signatures: []TransactionSignature{}
		)!

		tx = tx_db.set(tx)!
		retrieved_tx := tx_db.get(tx.id)!
		assert retrieved_tx.tx_type == tx_type
	}
}

fn test_transaction_edge_cases() {
	mut mydb := setup_test_db()!
	mut tx_db := DBTransaction{db: &mydb}

	// Test minimal transaction
	mut minimal_tx := tx_db.new(
		name: ''
		description: ''
		txid: 0
		source: 0
		destination: 0
		assetid: 0
		amount: 0.0
		timestamp: 0
		status: ''
		memo: ''
		tx_type: .transfer
		signatures: []TransactionSignature{}
	)!

	minimal_tx = tx_db.set(minimal_tx)!
	retrieved_minimal := tx_db.get(minimal_tx.id)!

	assert retrieved_minimal.name == ''
	assert retrieved_minimal.description == ''
	assert retrieved_minimal.txid == 0
	assert retrieved_minimal.amount == 0.0
	assert retrieved_minimal.status == ''
	assert retrieved_minimal.memo == ''
	assert retrieved_minimal.signatures.len == 0

	// Test transaction with many signatures
	many_sigs := []TransactionSignature{len: 100, init: TransactionSignature{
		signer_id: u32(index + 1)
		signature: 'sig_${index}_abcd123'
		timestamp: u64(1234567890 + index)
	}}

	mut large_tx := tx_db.new(
		name: 'Large Signature Transaction'
		description: 'Transaction with many signatures'
		txid: 9999
		source: 100
		destination: 200
		assetid: 1
		amount: 1000.0
		timestamp: 1234567890
		status: 'multisig'
		memo: 'Many signatures test'
		tx_type: .transfer
		signatures: many_sigs
	)!

	large_tx = tx_db.set(large_tx)!
	retrieved_large := tx_db.get(large_tx.id)!

	assert retrieved_large.signatures.len == 100
	assert retrieved_large.signatures[0].signer_id == 1
	assert retrieved_large.signatures[99].signer_id == 100
	assert retrieved_large.signatures[50].signature == 'sig_50_abcd123'
}
375 lib/threefold/models_ledger/user_test.v Normal file
@@ -0,0 +1,375 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_user_new() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Create test user with encrypted data
	secret_profile := SecretBox{
		data: [u8(1), 2, 3, 4, 5]
		nonce: [u8(10), 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21]
	}

	secret_kyc := SecretBox{
		data: [u8(100), 101, 102, 103]
		nonce: [u8(200), 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211]
	}

	mut user := user_db.new(
		name: 'Test User'
		description: 'A test user for unit testing'
		username: 'testuser123'
		pubkey: 'ed25519_ABCD...XYZ'
		email: ['test@example.com', 'test2@example.com']
		status: .active
		userprofile: [secret_profile]
		kyc: [secret_kyc]
	)!

	// Verify the user was created with correct values
	assert user.name == 'Test User'
	assert user.description == 'A test user for unit testing'
	assert user.username == 'testuser123'
	assert user.pubkey == 'ed25519_ABCD...XYZ'
	assert user.email.len == 2
	assert user.email[0] == 'test@example.com'
	assert user.email[1] == 'test2@example.com'
	assert user.status == .active
	assert user.userprofile.len == 1
	assert user.kyc.len == 1
	assert user.id == 0 // Should be 0 before saving
	assert user.updated_at > 0 // Should have timestamp
}

fn test_user_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Create a complex user with multiple encrypted boxes
	profile1 := SecretBox{
		data: [u8(1), 2, 3, 4, 5, 6, 7, 8, 9, 10]
		nonce: [u8(10), 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21]
	}

	profile2 := SecretBox{
		data: [u8(50), 51, 52, 53, 54]
		nonce: [u8(60), 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71]
	}

	kyc1 := SecretBox{
		data: [u8(100), 101, 102, 103, 104, 105, 106]
		nonce: [u8(200), 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211]
	}

	mut original_user := user_db.new(
		name: 'Encoding Test User'
		description: 'Testing encoding and decoding functionality'
		username: 'encodeuser'
		pubkey: 'ed25519_ENCODE...TEST'
		email: ['encode@test.com', 'encode2@test.com', 'encode3@test.com']
		status: .suspended
		userprofile: [profile1, profile2]
		kyc: [kyc1]
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_user.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_user := User{}
	user_db.load(mut decoded_user, mut decoder_obj)!

	// Verify all fields match after encoding/decoding
	assert decoded_user.username == original_user.username
	assert decoded_user.pubkey == original_user.pubkey
	assert decoded_user.status == original_user.status

	// Verify email list
	assert decoded_user.email.len == original_user.email.len
	for i, email in original_user.email {
		assert decoded_user.email[i] == email
	}

	// Verify userprofile SecretBox arrays
	assert decoded_user.userprofile.len == original_user.userprofile.len
	for i, profile in original_user.userprofile {
		assert decoded_user.userprofile[i].data.len == profile.data.len
		assert decoded_user.userprofile[i].nonce.len == profile.nonce.len
		for j, byte_val in profile.data {
			assert decoded_user.userprofile[i].data[j] == byte_val
		}
		for j, nonce_val in profile.nonce {
			assert decoded_user.userprofile[i].nonce[j] == nonce_val
		}
	}

	// Verify kyc SecretBox arrays
	assert decoded_user.kyc.len == original_user.kyc.len
	for i, kyc_box in original_user.kyc {
		assert decoded_user.kyc[i].data.len == kyc_box.data.len
		assert decoded_user.kyc[i].nonce.len == kyc_box.nonce.len
		for j, byte_val in kyc_box.data {
			assert decoded_user.kyc[i].data[j] == byte_val
		}
		for j, nonce_val in kyc_box.nonce {
			assert decoded_user.kyc[i].nonce[j] == nonce_val
		}
	}
}

fn test_user_set_and_get() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Create user
	mut user := user_db.new(
		name: 'DB Test User'
		description: 'Testing database operations'
		username: 'dbuser'
		pubkey: 'ed25519_DB...TEST'
		email: ['db@test.com']
		status: .active
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!

	// Save the user
	user = user_db.set(user)!

	// Verify ID was assigned
	assert user.id > 0
	original_id := user.id

	// Retrieve the user
	retrieved_user := user_db.get(user.id)!

	// Verify all fields match through the database roundtrip
	assert retrieved_user.id == original_id
	assert retrieved_user.name == 'DB Test User'
	assert retrieved_user.description == 'Testing database operations'
	assert retrieved_user.username == 'dbuser'
	assert retrieved_user.pubkey == 'ed25519_DB...TEST'
	assert retrieved_user.email.len == 1
	assert retrieved_user.email[0] == 'db@test.com'
	assert retrieved_user.status == .active
	assert retrieved_user.userprofile.len == 0
	assert retrieved_user.kyc.len == 0
}

fn test_user_update() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Create and save a user
	mut user := user_db.new(
		name: 'Original User'
		description: 'Original description'
		username: 'original'
		pubkey: 'ed25519_ORIG...123'
		email: ['orig@test.com']
		status: .active
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!

	user = user_db.set(user)!
	original_id := user.id
	original_created_at := user.created_at

	// Update the user
	new_profile := SecretBox{
		data: [u8(99), 98, 97]
		nonce: [u8(10), 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21]
	}

	user.name = 'Updated User'
	user.description = 'Updated description'
	user.email = ['updated@test.com', 'updated2@test.com']
	user.status = .inactive
	user.userprofile = [new_profile]

	user = user_db.set(user)!

	// Verify ID remains the same
	assert user.id == original_id
	assert user.created_at == original_created_at

	// Retrieve and verify updates
	updated_user := user_db.get(user.id)!
	assert updated_user.name == 'Updated User'
	assert updated_user.description == 'Updated description'
	assert updated_user.email.len == 2
	assert updated_user.email[0] == 'updated@test.com'
	assert updated_user.email[1] == 'updated2@test.com'
	assert updated_user.status == .inactive
	assert updated_user.userprofile.len == 1
	assert updated_user.userprofile[0].data.len == 3
}

fn test_user_exist_and_delete() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Test non-existent user
	exists := user_db.exist(999)!
	assert exists == false

	// Create and save a user
	mut user := user_db.new(
		name: 'To Be Deleted'
		description: 'This user will be deleted'
		username: 'deleteme'
		pubkey: 'ed25519_DEL...123'
		email: ['delete@test.com']
		status: .archived
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!

	user = user_db.set(user)!
	user_id := user.id

	// Test existing user
	exists_after_save := user_db.exist(user_id)!
	assert exists_after_save == true

	// Delete the user
	user_db.delete(user_id)!

	// Verify it no longer exists
	exists_after_delete := user_db.exist(user_id)!
	assert exists_after_delete == false

	// Verify get fails
	if _ := user_db.get(user_id) {
		panic('Should not be able to get deleted user')
	}
}

fn test_user_list() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Initially should be empty
	initial_list := user_db.list()!
	initial_count := initial_list.len

	// Create multiple users
	mut user1 := user_db.new(
		name: 'User 1'
		description: 'First user'
		username: 'user1'
		pubkey: 'ed25519_USER1...123'
		email: ['user1@test.com']
		status: .active
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!

	mut user2 := user_db.new(
		name: 'User 2'
		description: 'Second user'
		username: 'user2'
		pubkey: 'ed25519_USER2...456'
		email: ['user2@test.com', 'user2alt@test.com']
		status: .suspended
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!

	// Save both users
	user1 = user_db.set(user1)!
	user2 = user_db.set(user2)!

	// List users
	user_list := user_db.list()!

	// Should have 2 more users than initially
	assert user_list.len == initial_count + 2

	// Find our users in the list
	mut found_user1 := false
	mut found_user2 := false

	for u in user_list {
		if u.username == 'user1' {
			found_user1 = true
			assert u.status == .active
			assert u.email.len == 1
		}
		if u.username == 'user2' {
			found_user2 = true
			assert u.status == .suspended
			assert u.email.len == 2
		}
	}

	assert found_user1 == true
	assert found_user2 == true
}

fn test_user_edge_cases() {
	mut mydb := setup_test_db()!
	mut user_db := DBUser{db: &mydb}

	// Test empty/minimal user
	mut minimal_user := user_db.new(
		name: ''
		description: ''
		username: ''
		pubkey: ''
		email: []string{}
		status: .active
		userprofile: []SecretBox{}
		kyc: []SecretBox{}
	)!

	minimal_user = user_db.set(minimal_user)!
	retrieved_minimal := user_db.get(minimal_user.id)!

	assert retrieved_minimal.name == ''
	assert retrieved_minimal.description == ''
	assert retrieved_minimal.username == ''
	assert retrieved_minimal.pubkey == ''
	assert retrieved_minimal.email.len == 0
	assert retrieved_minimal.userprofile.len == 0
	assert retrieved_minimal.kyc.len == 0

	// Test user with large data arrays
	large_data := []u8{len: 1000, init: u8(index % 256)}
	large_nonce := []u8{len: 12, init: u8(index + 100)}

	large_secret := SecretBox{
		data: large_data
		nonce: large_nonce
	}

	mut large_user := user_db.new(
		name: 'Large Data User'
		description: 'User with large encrypted data'
		username: 'largeuser'
		pubkey: 'ed25519_LARGE...123'
		email: ['large@test.com']
		status: .active
		userprofile: [large_secret, large_secret] // Duplicate for testing
		kyc: [large_secret]
	)!

	large_user = user_db.set(large_user)!
	retrieved_large := user_db.get(large_user.id)!

	assert retrieved_large.userprofile.len == 2
	assert retrieved_large.kyc.len == 1
	assert retrieved_large.userprofile[0].data.len == 1000
	assert retrieved_large.userprofile[0].nonce.len == 12
	assert retrieved_large.userprofile[0].data[0] == 0
	assert retrieved_large.userprofile[0].data[999] == 231 // 999 % 256
}
127 lib/threefold/models_ledger/userkvs_test.v Normal file
@@ -0,0 +1,127 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_userkvs_new() {
	mut mydb := setup_test_db()!
	mut kvs_db := DBUserKVS{db: &mydb}

	mut userkvs := kvs_db.new(
		name: 'Test User KVS'
		description: 'A test user KVS for unit testing'
		user_id: 1
	)!

	assert userkvs.name == 'Test User KVS'
	assert userkvs.description == 'A test user KVS for unit testing'
	assert userkvs.user_id == 1
	assert userkvs.id == 0
	assert userkvs.updated_at > 0
}

fn test_userkvs_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut kvs_db := DBUserKVS{db: &mydb}

	mut original_kvs := kvs_db.new(
		name: 'Encoding Test KVS'
		description: 'Testing encoding and decoding'
		user_id: 999
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_kvs.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_kvs := UserKVS{}
	kvs_db.load(mut decoded_kvs, mut decoder_obj)!

	// Verify all fields match
	assert decoded_kvs.user_id == original_kvs.user_id
}

fn test_userkvs_crud_operations() {
	mut mydb := setup_test_db()!
	mut kvs_db := DBUserKVS{db: &mydb}

	// Create and save
	mut userkvs := kvs_db.new(
		name: 'CRUD Test KVS'
		description: 'Testing CRUD operations'
		user_id: 5
	)!

	userkvs = kvs_db.set(userkvs)!
	assert userkvs.id > 0
	kvs_id := userkvs.id

	// Get
	retrieved := kvs_db.get(kvs_id)!
	assert retrieved.user_id == 5
	assert retrieved.name == 'CRUD Test KVS'

	// Update
	userkvs.name = 'Updated KVS'
	userkvs.description = 'Updated description'
	userkvs = kvs_db.set(userkvs)!

	updated := kvs_db.get(kvs_id)!
	assert updated.name == 'Updated KVS'
	assert updated.description == 'Updated description'

	// Exist
	exists := kvs_db.exist(kvs_id)!
	assert exists == true

	// Delete
	kvs_db.delete(kvs_id)!
	exists_after := kvs_db.exist(kvs_id)!
	assert exists_after == false
}

fn test_userkvs_list() {
	mut mydb := setup_test_db()!
	mut kvs_db := DBUserKVS{db: &mydb}

	initial_list := kvs_db.list()!
	initial_count := initial_list.len

	// Create multiple KVS
	mut kvs1 := kvs_db.new(
		name: 'KVS 1'
		description: 'First KVS'
		user_id: 1
	)!

	mut kvs2 := kvs_db.new(
		name: 'KVS 2'
		description: 'Second KVS'
		user_id: 2
	)!

	kvs1 = kvs_db.set(kvs1)!
	kvs2 = kvs_db.set(kvs2)!

	kvs_list := kvs_db.list()!
	assert kvs_list.len == initial_count + 2

	mut found1 := false
	mut found2 := false
	for kvs in kvs_list {
		if kvs.user_id == 1 {
			found1 = true
			assert kvs.name == 'KVS 1'
		}
		if kvs.user_id == 2 {
			found2 = true
			assert kvs.name == 'KVS 2'
		}
	}
	assert found1 && found2
}
206 lib/threefold/models_ledger/userkvsitem_test.v Normal file
@@ -0,0 +1,206 @@
#!/usr/bin/env -S v -n -w -cg -gc none -cc tcc -d use_openssl -enable-globals -no-skip-unused test

module models_ledger

import freeflowuniverse.herolib.hero.db
import freeflowuniverse.herolib.data.encoder

fn test_userkvsitem_new() {
	mut mydb := setup_test_db()!
	mut item_db := DBUserKVSItem{db: &mydb}

	mut item := item_db.new(
		name: 'Test KVS Item'
		description: 'A test KVS item for unit testing'
		kvs_id: 1
		key: 'test_key'
		value: 'test_value'
	)!

	assert item.name == 'Test KVS Item'
	assert item.description == 'A test KVS item for unit testing'
	assert item.kvs_id == 1
	assert item.key == 'test_key'
	assert item.value == 'test_value'
	assert item.timestamp > 0
	assert item.id == 0
	assert item.updated_at > 0
}

fn test_userkvsitem_encoding_decoding() {
	mut mydb := setup_test_db()!
	mut item_db := DBUserKVSItem{db: &mydb}

	mut original_item := item_db.new(
		name: 'Encoding Test Item'
		description: 'Testing encoding and decoding'
		kvs_id: 999
		key: 'encoding_key'
		value: 'encoding_value_with_special_chars_!@#$%^&*()'
	)!

	// Test encoding
	mut encoder_obj := encoder.encoder_new()
	original_item.dump(mut encoder_obj)!
	encoded_data := encoder_obj.data

	// Test decoding
	mut decoder_obj := encoder.decoder_new(encoded_data)
	mut decoded_item := UserKVSItem{}
	item_db.load(mut decoded_item, mut decoder_obj)!

	// Verify all fields match
	assert decoded_item.kvs_id == original_item.kvs_id
	assert decoded_item.key == original_item.key
	assert decoded_item.value == original_item.value
	assert decoded_item.timestamp == original_item.timestamp
}

fn test_userkvsitem_crud_operations() {
	mut mydb := setup_test_db()!
	mut item_db := DBUserKVSItem{db: &mydb}

	// Create and save
	mut item := item_db.new(
		name: 'CRUD Test Item'
		description: 'Testing CRUD operations'
		kvs_id: 5
		key: 'crud_key'
		value: 'crud_value'
	)!

	item = item_db.set(item)!
	assert item.id > 0
	item_id := item.id

	// Get
	retrieved := item_db.get(item_id)!
	assert retrieved.kvs_id == 5
	assert retrieved.key == 'crud_key'
	assert retrieved.value == 'crud_value'

	// Update
	item.key = 'updated_key'
	item.value = 'updated_value'
	item = item_db.set(item)!

	updated := item_db.get(item_id)!
	assert updated.key == 'updated_key'
	assert updated.value == 'updated_value'

	// Exist
	exists := item_db.exist(item_id)!
	assert exists == true

	// Delete
	item_db.delete(item_id)!
	exists_after := item_db.exist(item_id)!
	assert exists_after == false
}

fn test_userkvsitem_list() {
	mut mydb := setup_test_db()!
	mut item_db := DBUserKVSItem{db: &mydb}

	initial_list := item_db.list()!
	initial_count := initial_list.len

	// Create multiple items
	mut item1 := item_db.new(
		name: 'Item 1'
		description: 'First item'
		kvs_id: 1
		key: 'key1'
		value: 'value1'
	)!

	mut item2 := item_db.new(
		name: 'Item 2'
		description: 'Second item'
		kvs_id: 2
		key: 'key2'
		value: 'value2'
	)!

	item1 = item_db.set(item1)!
	item2 = item_db.set(item2)!

	item_list := item_db.list()!
	assert item_list.len == initial_count + 2

	mut found1 := false
	mut found2 := false
	for item in item_list {
		if item.key == 'key1' {
			found1 = true
			assert item.kvs_id == 1
			assert item.value == 'value1'
		}
		if item.key == 'key2' {
			found2 = true
			assert item.kvs_id == 2
			assert item.value == 'value2'
		}
	}
	assert found1 && found2
}

fn test_userkvsitem_edge_cases() {
	mut mydb := setup_test_db()!
	mut item_db := DBUserKVSItem{db: &mydb}

	// Test empty strings
	mut minimal_item := item_db.new(
		name: ''
		description: ''
		kvs_id: 0
		key: ''
		value: ''
	)!

	minimal_item = item_db.set(minimal_item)!
	retrieved := item_db.get(minimal_item.id)!

	assert retrieved.name == ''
	assert retrieved.description == ''
	assert retrieved.kvs_id == 0
	assert retrieved.key == ''
	assert retrieved.value == ''

	// Test large strings
	large_key := 'large_key_' + 'K'.repeat(1000)
	large_value := 'large_value_' + 'V'.repeat(10000)

	mut large_item := item_db.new(
		name: 'Large Item'
		description: 'Testing with large strings'
		kvs_id: 999999
		key: large_key
		value: large_value
	)!

	large_item = item_db.set(large_item)!
	retrieved_large := item_db.get(large_item.id)!

	assert retrieved_large.key == large_key
	assert retrieved_large.value == large_value
	assert retrieved_large.kvs_id == 999999

	// Test special characters
	special_key := 'key_with_unicode_🔑'
	special_value := 'value_with_unicode_💎_and_newlines\n\r\t'

	mut special_item := item_db.new(
		name: 'Special Characters Item'
		description: 'Testing special characters'
		kvs_id: 1
		key: special_key
		value: special_value
	)!

	special_item = item_db.set(special_item)!
	retrieved_special := item_db.get(special_item.id)!

	assert retrieved_special.key == special_key
	assert retrieved_special.value == special_value
}