# Mock Data Task

AngusTester efficiently generates test data in multiple formats to meet diverse testing scenario requirements.

Mock data functionality supports 7 mainstream data formats, helping you:

- Bulk generate massive test data
- Flexibly configure data structure and storage methods
- Seamlessly integrate with databases and file systems
- Reference generated data directly in tests through parameterization
## Primary Parameter List

Parameter | Type | Required | Constraints | Description |
---|---|---|---|---|
name | string | No | ≤200 characters | Data name (auto-set as script name) |
description | string | No | ≤800 characters | Detailed description (auto-set as script description) |
fields | array[object] | No | 1-200 items | Data field definitions (see details below) |
settings | object | No | - | Data generation settings (see details below) |

Complete configuration example:

```yaml
mockData:
  name: User Data
  description: User account/password data in CSV format
  fields:
    - name: username
      type: string
      value: "@String(3,20)"
    - name: password
      type: string
      value: "@String(6,32)"
  settings:
    format: CSV
    rows: 100000
    batchRows: 200
    location: LOCAL
    lineEnding: UNIT_LF
    includeHeader: true
```
## Field Definitions (fields)

Parameter | Type | Required | Constraints | Description |
---|---|---|---|---|
name | string | Yes | ≤200 characters | Field name |
type | enum | Yes | - | Field type: string, integer, boolean, number, object, array. Note: object/array are only supported in the JSON format |
value | string | No | ≤8192 characters | Field value (constant or mock function) |

Example configuration:

```yaml
fields:
  - name: username
    type: string
    value: "@String(3,20)"
  - name: age
    type: integer
    value: "@Number(18,60)"
```
## Generation Settings (settings)

Parameter | Type | Required | Constraints | Description |
---|---|---|---|---|
format | string | Yes | ≤40 characters | Data format: CSV, CUSTOM, EXCEL, JSON, SQL, TAB, XML |
rows | integer | Yes | 1-100 billion | Total rows to generate |
batchRows | integer | No | 1-10000 | Rows generated per batch |
location | enum | Yes | - | Storage location: DATASPACE, DATASOURCE, LOCAL, PUSH_THIRD |
storeRequest | object | No | - | HTTP request config (required when storing to DATASPACE or PUSH_THIRD). Note: only the POST method is supported. See "Script Specifications" for HTTP request details. |
Plugin Parameters | object | No | - | Format-specific plugin parameters (see "Format-Specific Plugin Parameters" below). |
## Storage Location (location)

### DATASPACE · Dataspace Storage

- Location: AngusTester space: Data → File Data → Space
- Use Cases:
  - Team collaboration/shared data
  - Long-term test dataset storage
  - Cross-test data reuse
- Access: Managed online via AngusTester

### DATASOURCE · Database Storage

- Location: AngusTester built-in datasource: Data → Datasource Data → Datasource
- Use Cases:
  - Database stress test initialization
  - Production environment data pre-fill
  - Data-driven testing support
- Access: Direct DB client connection or via app

### LOCAL · Local File Storage

- Location: Execution node file system
- Default Path: ${AGENT_HOME}/data/exec/[executionID]/data.[format]
- Use Cases:
  - Temporary test data generation
  - Quick access to raw data files
  - Large file processing scenarios
- Access: Direct download from execution page
### PUSH_THIRD · HTTP API Push

- Transfer Modes:
  - Smart Detection:
    - ContentType=application/octet-stream
    - FormData contains a file type
- Performance Tip: Set batchRows=1000 for text streaming
- Use Cases:
  - Data integration pipelines
  - Third-party system integration
  - Real-time data pipelines
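
As a concrete illustration of a PUSH_THIRD configuration, the sketch below sets ContentType=application/octet-stream (one of the Smart Detection conditions above). The endpoint URL and bearer token are placeholders, and the storeRequest shape follows the structure shown in Scenario 4 below.

```yaml
# Sketch: push generated CSV data to a third-party HTTP endpoint.
# The URL and token below are placeholders, not real endpoints.
settings:
  format: CSV
  rows: 100000
  batchRows: 1000                                    # rows generated per push batch
  location: PUSH_THIRD
  lineEnding: UNIT_LF
  includeHeader: true
  storeRequest:
    url: https://example.com/api/mock-data/upload    # placeholder endpoint
    parameters:
      - name: Content-Type
        in: header
        enabled: true
        type: string
        value: application/octet-stream              # Smart Detection condition
      - name: Authorization
        in: header
        enabled: true
        type: string
        value: "Bearer {token}"
```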
## Format-Specific Plugin Parameters (Plugin Parameters)

Special configuration parameters for each data format:
### CSV Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
lineEnding | enum | Yes | Line ending: UNIT_LF / WINDOWS_CRLF |
includeHeader | boolean | Yes | Include header row |
### Custom Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
escapeChar | char | No | Escape char (default \u0000) |
quoteChar | char | No | Quote char (default \u0000) |
separatorChar | char | No | Separator char (default ,) |
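
A minimal sketch of a CUSTOM-format configuration, assuming these plugin parameters are placed directly under settings like the other format parameters; the pipe separator and quoting characters are illustrative choices, not defaults.

```yaml
# Sketch: pipe-delimited custom text output (separator/quote choices are illustrative)
settings:
  format: CUSTOM
  rows: 10000
  location: LOCAL
  separatorChar: "|"
  quoteChar: "\""
  escapeChar: "\\"
```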
### Excel Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
includeHeader | boolean | Yes | Include header row |
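
For completeness, a minimal EXCEL sketch under the same assumption that the plugin parameter sits directly under settings.

```yaml
# Sketch: Excel output with a header row
settings:
  format: EXCEL
  rows: 10000
  location: LOCAL
  includeHeader: true
```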
### JSON Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
includeNull | boolean | No | Include null fields |
rowsToArray | boolean | No | Convert to array format |
### SQL Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
tableName | string | Yes | Database table name |
batchInsert | boolean | No | Enable batch insert |
storeDatasource | object | Yes | Database config (see below) |

Database Connection Config:

```yaml
storeDatasource:
  type: MYSQL
  driverClassName: com.mysql.cj.jdbc.Driver
  jdbcUrl: jdbc:mysql://localhost:3306/db
  username: user
  password: pass
```
### TAB Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
includeHeader | boolean | Yes | Include header row |
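
Likewise, a minimal TAB sketch under the same assumption.

```yaml
# Sketch: tab-delimited output written to the local node
settings:
  format: TAB
  rows: 10000
  location: LOCAL
  includeHeader: true
```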
### XML Format Parameters

Parameter | Type | Required | Description |
---|---|---|---|
rootElement | string | Yes | Root element name |
recordElement | string | Yes | Record element name |
## Best Practice Scenarios

### Scenario 1: CSV User Data Generation

```yaml
settings:
  format: CSV
  rows: 50000
  location: LOCAL
  lineEnding: WINDOWS_CRLF
  includeHeader: true
```

### Scenario 2: JSON Test Data

```yaml
settings:
  format: JSON
  rows: 20000
  location: DATASPACE
  includeNull: false
  rowsToArray: true
```

### Scenario 3: Direct Database Write

```yaml
settings:
  format: SQL
  rows: 100000            # total rows to generate (required; illustrative value)
  location: DATASOURCE
  tableName: users        # target table (required for SQL format; illustrative name)
  batchInsert: true
  storeDatasource:
    type: POSTGRES
    jdbcUrl: jdbc:postgresql://dbserver:5432/testdb
    username: admin
    password: securePass123
```

### Scenario 4: Push to Third-Party System

```yaml
settings:
  format: XML
  rows: 50000             # total rows to generate (required; illustrative value)
  location: PUSH_THIRD
  rootElement: Users
  recordElement: User
  storeRequest:
    url: https://api.example.com/upload
    parameters:
      - name: Content-Type
        in: header
        enabled: true
        type: string
        value: multipart/form-data
      - name: Authorization
        in: header
        enabled: true
        type: string
        value: "Bearer {token}"
```