[WIP] workflow builder

.gitmodules (vendored, new file, +6)
@@ -0,0 +1,6 @@
+[submodule "packs.external/python_example"]
+	path = packs.external/python_example
+	url = https://git.rdrx.app/attune-packs/python_example.git
+[submodule "packs.external/nodejs_example"]
+	path = packs.external/nodejs_example
+	url = https://git.rdrx.app/attune-packs/nodejs_example.git
AGENTS.md (47 changes)
@@ -220,12 +220,47 @@ Enforcement created → Execution scheduled → Worker executes Action
 - Development packs in `./packs.dev/` are bind-mounted directly for instant updates
 - **Pack Binaries**: Native binaries (sensors) built separately with `./scripts/build-pack-binaries.sh`
 - **Action Script Resolution**: Worker constructs file paths as `{packs_base_dir}/{pack_ref}/actions/{entrypoint}`
+- **Workflow File Storage**: Visual workflow builder saves files to `{packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml` via `POST /api/v1/packs/{pack_ref}/workflow-files` and `PUT /api/v1/workflows/{ref}/file` endpoints
+- **Task Model (Orquesta-aligned)**: Tasks are purely action invocations — there is no task `type` field or task-level `when` condition in the UI model. Parallelism is implicit (multiple `do` targets in a transition fan out into parallel branches). Conditions belong exclusively on transitions (`next[].when`). Each task has: `name`, `action`, `input`, `next` (transitions), `delay`, `retry`, `timeout`, `with_items`, `batch_size`, `concurrency`, `join`.
+  - The backend `Task` struct (`crates/common/src/workflow/parser.rs`) still supports `type` and task-level `when` for backward compatibility, but the UI never sets them.
+- **Task Transition Model (Orquesta-style)**: Tasks use an ordered `next` array of transitions instead of flat `on_success`/`on_failure`/`on_complete`/`on_timeout` fields. Each transition has:
+  - `when` — condition expression (e.g., `{{ succeeded() }}`, `{{ failed() }}`, `{{ timed_out() }}`, or custom). Omit for unconditional.
+  - `publish` — key-value pairs to publish into the workflow context (e.g., `- result: "{{ result() }}"`)
+  - `do` — list of next task names to invoke when the condition is met
+  - `label` — optional custom display label (overrides auto-derived label from `when` expression)
+  - `color` — optional custom CSS color for the transition edge (e.g., `"#ff6600"`)
+  - **Example YAML**:
+    ```
+    next:
+      - when: "{{ succeeded() }}"
+        label: "main path"
+        color: "#22c55e"
+        publish:
+          - msg: "task done"
+        do:
+          - log
+          - next_task
+      - when: "{{ failed() }}"
+        do:
+          - error_handler
+    ```
+  - **Legacy format support**: The parser (`crates/common/src/workflow/parser.rs`) auto-converts legacy `on_success`/`on_failure`/`on_complete`/`on_timeout`/`decision` fields into `next` transitions during parsing. The canonical internal representation always uses `next`.
+  - **Frontend types**: `TaskTransition` in `web/src/types/workflow.ts`; `TransitionPreset` ("succeeded" | "failed" | "always") for quick-access drag handles
+  - **Backend types**: `TaskTransition` in `crates/common/src/workflow/parser.rs`; `GraphTransition` in `crates/executor/src/workflow/graph.rs`
+  - **NOT this** (legacy format): `on_success: task2` / `on_failure: error_handler` — still parsed for backward compat but normalized to `next`
 - **Runtime YAML Loading**: Pack registration reads `runtimes/*.yaml` files and inserts them into the `runtime` table. Runtime refs use format `{pack_ref}.{name}` (e.g., `core.python`, `core.shell`).
 - **Runtime Selection**: Determined by action's runtime field (e.g., "Shell", "Python") - compared case-insensitively; when an explicit `runtime_name` is set in execution context, it is authoritative (no fallback to extension matching)
 - **Worker Runtime Loading**: Worker loads all runtimes from DB that have a non-empty `execution_config` (i.e., runtimes with an interpreter configured). Native runtimes (e.g., `core.native` with empty config) are automatically skipped since they execute binaries directly.
 - **Native Runtime Detection**: Runtime detection is purely data-driven via `execution_config` in the runtime table. A runtime with empty `execution_config` (or empty `interpreter.binary`) is native — the entrypoint is executed directly without an interpreter. There is no special "builtin" runtime concept.
 - **Sensor Runtime Assignment**: Sensors declare their `runner_type` in YAML (e.g., `python`, `native`). The pack loader resolves this to the correct runtime from the database. Default is `native` (compiled binary, no interpreter). Legacy values `standalone` and `builtin` map to `core.native`.
 - **Runtime Environment Setup**: Worker creates isolated environments (virtualenvs, node_modules) on-demand at `{runtime_envs_dir}/{pack_ref}/{runtime_name}` before first execution; setup is idempotent
+- **Schema Format (Unified)**: ALL schemas (`param_schema`, `out_schema`, `conf_schema`) use the same flat format with `required` and `secret` inlined per-parameter (NOT standard JSON Schema). Stored as JSONB columns.
+  - **Example YAML**: `parameters:\n url:\n type: string\n required: true\n token:\n type: string\n secret: true`
+  - **Stored JSON**: `{"url": {"type": "string", "required": true}, "token": {"type": "string", "secret": true}}`
+  - **NOT this** (legacy JSON Schema): `{"type": "object", "properties": {"url": {"type": "string"}}, "required": ["url"]}`
+  - **Web UI**: `extractProperties()` in `ParamSchemaForm.tsx` is the single extraction function for all schema types. Only handles flat format.
+  - **SchemaBuilder**: Visual schema editor reads and writes flat format with `required` and `secret` checkboxes per parameter.
+  - **Backend Validation**: `flat_to_json_schema()` in `crates/api/src/validation/params.rs` converts flat format to JSON Schema internally for `jsonschema` crate validation. This conversion is an implementation detail — external interfaces always use flat format.
 - **Parameter Delivery**: Actions receive parameters via stdin as JSON (never environment variables)
 - **Output Format**: Actions declare output format (text/json/yaml) - json/yaml are parsed into execution.result JSONB
 - **Standard Environment Variables**: Worker provides execution context via `ATTUNE_*` environment variables:
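The legacy-field normalization described in the Task Transition Model notes can be sketched as follows. This is hypothetical Python illustrating the conversion the Rust parser in `crates/common/src/workflow/parser.rs` is described as performing; the exact field handling (string vs. list targets, ordering) is an assumption.

```python
def normalize_transitions(task: dict) -> list[dict]:
    """Convert legacy on_success/on_failure/on_timeout/on_complete fields
    into an ordered `next` transition list (illustrative sketch)."""
    if "next" in task:
        # Already in canonical form.
        return task["next"]
    legacy = [
        ("on_success", "{{ succeeded() }}"),
        ("on_failure", "{{ failed() }}"),
        ("on_timeout", "{{ timed_out() }}"),
        ("on_complete", None),  # unconditional transition
    ]
    transitions = []
    for field, when in legacy:
        targets = task.get(field)
        if targets is None:
            continue
        if isinstance(targets, str):
            targets = [targets]
        transition = {"do": targets}
        if when is not None:
            transition["when"] = when
        transitions.append(transition)
    return transitions
```

A legacy task such as `on_success: task2` / `on_failure: error_handler` thus normalizes to two `next` entries guarded by `{{ succeeded() }}` and `{{ failed() }}`.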
@@ -275,6 +310,16 @@ Rule `action_params` support Jinja2-style `{{ source.path }}` templates resolved
 - **Styling**: Tailwind utility classes
 - **Dev Server**: `npm run dev` (typically :3000 or :5173)
 - **Build**: `npm run build`
+- **Workflow Builder**: Visual node-based workflow editor at `/actions/workflows/new` and `/actions/workflows/:ref/edit`
+  - Components in `web/src/components/workflows/` (ActionPalette, WorkflowCanvas, TaskNode, WorkflowEdges, TaskInspector)
+  - Types and conversion utilities in `web/src/types/workflow.ts`
+  - Hooks in `web/src/hooks/useWorkflows.ts`
+  - Saves workflow files to `{packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml` via dedicated API endpoints
+  - **Visual / Raw YAML toggle**: Toolbar has a segmented toggle to switch between the visual node-based builder and a full-width read-only YAML preview (generated via `js-yaml`). Raw YAML mode replaces the canvas, palette, and inspector with the effective workflow definition.
+  - **Drag-handle connections**: TaskNode has output handles (green=succeeded, red=failed, gray=always) and an input handle (top). Drag from an output handle to another node's input handle to create a transition.
+  - **Transition customization**: Users can rename transitions (custom `label`) and assign custom colors (CSS color string or preset swatches) via the TaskInspector. Custom colors/labels are persisted in the workflow YAML and rendered on the canvas edges.
+  - **Orquesta-style `next` transitions**: Tasks use a `next: TaskTransition[]` array instead of flat `on_success`/`on_failure` fields. Each transition has `when` (condition), `publish` (variables), `do` (target tasks), plus optional `label` and `color`. See "Task Transition Model" above.
+  - **No task type or task-level condition**: The UI does not expose task `type` or task-level `when` — all tasks are actions (workflows are also actions), and conditions belong on transitions. Parallelism is implicit via multiple `do` targets.
 
 ## Development Workflow
 
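The save path the builder uses can be sketched with a small helper. This is a hypothetical illustration of the `{packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml` convention; the real path construction lives in the API service, not in this function.

```python
def workflow_file_path(packs_base_dir: str, pack_ref: str, name: str) -> str:
    """Build the on-disk location for a saved workflow file:
    {packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml
    (illustrative helper, not the actual implementation)."""
    if not name or "/" in name:
        raise ValueError("workflow name must be a plain filename component")
    return f"{packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml"
```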
@@ -425,7 +470,7 @@ When reporting, ask: "Should I fix this first or continue with [original task]?"
 - **Web UI**: Static files served separately or via API service
 
 ## Current Development Status
-- ✅ **Complete**: Database migrations (17 tables), API service (most endpoints), common library, message queue infrastructure, repository layer, JWT auth, CLI tool, Web UI (basic), Executor service (core functionality), Worker service (shell/Python execution)
+- ✅ **Complete**: Database migrations (17 tables), API service (most endpoints), common library, message queue infrastructure, repository layer, JWT auth, CLI tool, Web UI (basic + workflow builder), Executor service (core functionality), Worker service (shell/Python execution)
 - 🔄 **In Progress**: Sensor service, advanced workflow features, Python runtime dependency management
 - 📋 **Planned**: Notifier service, execution policies, monitoring, pack registry system
 
@@ -38,14 +38,14 @@ pub struct CreateActionRequest {
     #[schema(example = 1)]
     pub runtime: Option<i64>,
 
-    /// Parameter schema (JSON Schema) defining expected inputs
+    /// Parameter schema (StackStorm-style) defining expected inputs with inline required/secret
     #[serde(skip_serializing_if = "Option::is_none")]
-    #[schema(value_type = Object, nullable = true, example = json!({"type": "object", "properties": {"channel": {"type": "string"}, "message": {"type": "string"}}}))]
+    #[schema(value_type = Object, nullable = true, example = json!({"channel": {"type": "string", "description": "Slack channel", "required": true}, "message": {"type": "string", "description": "Message text", "required": true}}))]
     pub param_schema: Option<JsonValue>,
 
-    /// Output schema (JSON Schema) defining expected outputs
+    /// Output schema (flat format) defining expected outputs with inline required/secret
     #[serde(skip_serializing_if = "Option::is_none")]
-    #[schema(value_type = Object, nullable = true, example = json!({"type": "object", "properties": {"message_id": {"type": "string"}}}))]
+    #[schema(value_type = Object, nullable = true, example = json!({"message_id": {"type": "string", "description": "ID of the sent message", "required": true}}))]
     pub out_schema: Option<JsonValue>,
 }
 
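The flat-to-JSON-Schema conversion that `flat_to_json_schema()` is described as performing internally can be sketched like this. The exact Rust implementation is not shown in this diff, so the handling of `secret` and extra keys here is an assumption for illustration.

```python
def flat_to_json_schema(flat: dict) -> dict:
    """Convert the flat per-parameter format (inline required/secret)
    into standard JSON Schema for validation (illustrative sketch)."""
    properties = {}
    required = []
    for name, spec in flat.items():
        # required/secret are flat-format metadata, not JSON Schema keywords.
        properties[name] = {k: v for k, v in spec.items() if k not in ("required", "secret")}
        if spec.get("required"):
            required.append(name)
    schema = {"type": "object", "properties": properties}
    if required:
        schema["required"] = required
    return schema
```

For the stored-JSON example above, `{"url": {"type": "string", "required": true}, "token": {"type": "string", "secret": true}}` converts to a JSON Schema object with `url` in the top-level `required` array.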
@@ -71,7 +71,7 @@ pub struct UpdateActionRequest {
     #[schema(example = 1)]
     pub runtime: Option<i64>,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -115,7 +115,7 @@ pub struct ActionResponse {
     #[schema(example = 1)]
     pub runtime: Option<i64>,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -137,8 +137,8 @@ pub struct CreateInquiryRequest {
     #[schema(example = "Approve deployment to production?")]
     pub prompt: String,
 
-    /// Optional JSON schema for the expected response format
-    #[schema(value_type = Object, example = json!({"type": "object", "properties": {"approved": {"type": "boolean"}}}))]
+    /// Optional schema for the expected response format (flat format with inline required/secret)
+    #[schema(value_type = Object, example = json!({"approved": {"type": "boolean", "description": "Whether the deployment is approved", "required": true}}))]
     pub response_schema: Option<JsonSchema>,
 
     /// Optional identity ID to assign this inquiry to
@@ -28,9 +28,9 @@ pub struct CreatePackRequest {
     #[schema(example = "1.0.0")]
     pub version: String,
 
-    /// Configuration schema (JSON Schema)
+    /// Configuration schema (flat format with inline required/secret per parameter)
     #[serde(default = "default_empty_object")]
-    #[schema(value_type = Object, example = json!({"type": "object", "properties": {"api_token": {"type": "string"}}}))]
+    #[schema(value_type = Object, example = json!({"api_token": {"type": "string", "description": "API authentication key", "required": true, "secret": true}}))]
     pub conf_schema: JsonValue,
 
     /// Pack configuration values
@@ -95,11 +95,6 @@ pub struct InstallPackRequest {
     #[schema(example = "main")]
     pub ref_spec: Option<String>,
 
-    /// Force reinstall if pack already exists
-    #[serde(default)]
-    #[schema(example = false)]
-    pub force: bool,
-
     /// Skip running pack tests during installation
     #[serde(default)]
     #[schema(example = false)]
@@ -28,14 +28,14 @@ pub struct CreateTriggerRequest {
     #[schema(example = "Triggers when a webhook is received")]
     pub description: Option<String>,
 
-    /// Parameter schema (JSON Schema) defining event payload structure
+    /// Parameter schema (StackStorm-style) defining trigger configuration with inline required/secret
     #[serde(skip_serializing_if = "Option::is_none")]
-    #[schema(value_type = Object, nullable = true, example = json!({"type": "object", "properties": {"url": {"type": "string"}}}))]
+    #[schema(value_type = Object, nullable = true, example = json!({"url": {"type": "string", "description": "Webhook URL", "required": true}}))]
     pub param_schema: Option<JsonValue>,
 
-    /// Output schema (JSON Schema) defining event data structure
+    /// Output schema (flat format) defining event data structure with inline required/secret
     #[serde(skip_serializing_if = "Option::is_none")]
-    #[schema(value_type = Object, nullable = true, example = json!({"type": "object", "properties": {"payload": {"type": "object"}}}))]
+    #[schema(value_type = Object, nullable = true, example = json!({"payload": {"type": "object", "description": "Event payload data", "required": true}}))]
     pub out_schema: Option<JsonValue>,
 
     /// Whether the trigger is enabled
@@ -56,7 +56,7 @@ pub struct UpdateTriggerRequest {
     #[schema(example = "Updated webhook trigger description")]
     pub description: Option<String>,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -100,7 +100,7 @@ pub struct TriggerResponse {
     #[schema(example = true)]
     pub enabled: bool,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -208,9 +208,9 @@ pub struct CreateSensorRequest {
     #[schema(example = "monitoring.cpu_threshold")]
     pub trigger_ref: String,
 
-    /// Parameter schema (JSON Schema) for sensor configuration
+    /// Parameter schema (flat format) for sensor configuration
     #[serde(skip_serializing_if = "Option::is_none")]
-    #[schema(value_type = Object, nullable = true, example = json!({"type": "object", "properties": {"threshold": {"type": "number"}}}))]
+    #[schema(value_type = Object, nullable = true, example = json!({"threshold": {"type": "number", "description": "Alert threshold", "required": true}}))]
     pub param_schema: Option<JsonValue>,
 
     /// Configuration values for this sensor instance (conforms to param_schema)
@@ -242,7 +242,7 @@ pub struct UpdateSensorRequest {
    #[schema(example = "/sensors/monitoring/cpu_monitor_v2.py")]
     pub entrypoint: Option<String>,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -302,7 +302,7 @@ pub struct SensorResponse {
     #[schema(example = true)]
     pub enabled: bool,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -6,6 +6,54 @@ use serde_json::Value as JsonValue;
 use utoipa::{IntoParams, ToSchema};
 use validator::Validate;
 
+/// Request DTO for saving a workflow file to disk and syncing to DB
+#[derive(Debug, Clone, Deserialize, Validate, ToSchema)]
+pub struct SaveWorkflowFileRequest {
+    /// Workflow name (becomes filename: {name}.workflow.yaml)
+    #[validate(length(min = 1, max = 255))]
+    #[schema(example = "deploy_app")]
+    pub name: String,
+
+    /// Human-readable label
+    #[validate(length(min = 1, max = 255))]
+    #[schema(example = "Deploy Application")]
+    pub label: String,
+
+    /// Workflow description
+    #[schema(example = "Deploys an application to the target environment")]
+    pub description: Option<String>,
+
+    /// Workflow version (semantic versioning recommended)
+    #[validate(length(min = 1, max = 50))]
+    #[schema(example = "1.0.0")]
+    pub version: String,
+
+    /// Pack reference this workflow belongs to
+    #[validate(length(min = 1, max = 255))]
+    #[schema(example = "core")]
+    pub pack_ref: String,
+
+    /// The full workflow definition as JSON (will be serialized to YAML on disk)
+    #[schema(value_type = Object)]
+    pub definition: JsonValue,
+
+    /// Parameter schema (flat format with inline required/secret)
+    #[schema(value_type = Object, nullable = true)]
+    pub param_schema: Option<JsonValue>,
+
+    /// Output schema (flat format)
+    #[schema(value_type = Object, nullable = true)]
+    pub out_schema: Option<JsonValue>,
+
+    /// Tags for categorization
+    #[schema(example = json!(["deployment", "automation"]))]
+    pub tags: Option<Vec<String>>,
+
+    /// Whether the workflow is enabled
+    #[schema(example = true)]
+    pub enabled: Option<bool>,
+}
+
 /// Request DTO for creating a new workflow
 #[derive(Debug, Clone, Deserialize, Validate, ToSchema)]
 pub struct CreateWorkflowRequest {
@@ -33,12 +81,12 @@ pub struct CreateWorkflowRequest {
     #[schema(example = "1.0.0")]
     pub version: String,
 
-    /// Parameter schema (JSON Schema) defining expected inputs
-    #[schema(value_type = Object, example = json!({"type": "object", "properties": {"severity": {"type": "string"}, "channel": {"type": "string"}}}))]
+    /// Parameter schema (StackStorm-style) defining expected inputs with inline required/secret
+    #[schema(value_type = Object, example = json!({"severity": {"type": "string", "description": "Incident severity", "required": true}, "channel": {"type": "string", "description": "Notification channel"}}))]
     pub param_schema: Option<JsonValue>,
 
-    /// Output schema (JSON Schema) defining expected outputs
-    #[schema(value_type = Object, example = json!({"type": "object", "properties": {"incident_id": {"type": "string"}}}))]
+    /// Output schema (flat format) defining expected outputs with inline required/secret
+    #[schema(value_type = Object, example = json!({"incident_id": {"type": "string", "description": "Unique incident identifier", "required": true}}))]
     pub out_schema: Option<JsonValue>,
 
     /// Workflow definition (complete workflow YAML structure as JSON)
@@ -71,7 +119,7 @@ pub struct UpdateWorkflowRequest {
     #[schema(example = "1.1.0")]
     pub version: Option<String>,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -123,7 +171,7 @@ pub struct WorkflowResponse {
     #[schema(example = "1.0.0")]
     pub version: String,
 
-    /// Parameter schema
+    /// Parameter schema (StackStorm-style with inline required/secret)
     #[schema(value_type = Object, nullable = true)]
     pub param_schema: Option<JsonValue>,
 
@@ -40,7 +40,9 @@ use crate::{
 #[derive(Debug, Clone, Serialize, Deserialize, Validate, ToSchema)]
 pub struct CreateEventRequest {
     /// Trigger reference (e.g., "core.timer", "core.webhook")
+    /// Also accepts "trigger_type" for compatibility with the sensor interface spec.
     #[validate(length(min = 1))]
+    #[serde(alias = "trigger_type")]
     #[schema(example = "core.timer")]
     pub trigger_ref: String,
 
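The `#[serde(alias = "trigger_type")]` change above lets clients send either key name. A minimal sketch of the same aliasing behavior, with `parse_event_request` as a hypothetical helper name:

```python
def parse_event_request(payload: dict) -> dict:
    """Accept "trigger_type" as an alias for "trigger_ref", mirroring the
    serde alias attribute (illustrative sketch; validation simplified)."""
    trigger_ref = payload.get("trigger_ref", payload.get("trigger_type"))
    if not trigger_ref:
        raise ValueError("trigger_ref is required and must be non-empty")
    return {**payload, "trigger_ref": trigger_ref}
```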
@@ -10,9 +10,13 @@ use axum::{
 use std::sync::Arc;
 use validator::Validate;
 
+use attune_common::models::OwnerType;
 use attune_common::repositories::{
+    action::ActionRepository,
     key::{CreateKeyInput, KeyRepository, UpdateKeyInput},
-    Create, Delete, List, Update,
+    pack::PackRepository,
+    trigger::SensorRepository,
+    Create, Delete, FindByRef, List, Update,
 };
 
 use crate::auth::RequireAuth;
@@ -157,6 +161,78 @@ pub async fn create_key(
         )));
     }
 
+    // Auto-resolve owner IDs from refs when only the ref is provided.
+    // This makes the API more ergonomic for sensors and other clients that
+    // know the owner ref but not the numeric database ID.
+    let mut owner_sensor = request.owner_sensor;
+    let mut owner_action = request.owner_action;
+    let mut owner_pack = request.owner_pack;
+
+    match request.owner_type {
+        OwnerType::Sensor => {
+            if owner_sensor.is_none() {
+                if let Some(ref sensor_ref) = request.owner_sensor_ref {
+                    if let Some(sensor) =
+                        SensorRepository::find_by_ref(&state.db, sensor_ref).await?
+                    {
+                        tracing::debug!(
+                            "Auto-resolved owner_sensor from ref '{}' to id {}",
+                            sensor_ref,
+                            sensor.id
+                        );
+                        owner_sensor = Some(sensor.id);
+                    } else {
+                        return Err(ApiError::BadRequest(format!(
+                            "Sensor with ref '{}' not found",
+                            sensor_ref
+                        )));
+                    }
+                }
+            }
+        }
+        OwnerType::Action => {
+            if owner_action.is_none() {
+                if let Some(ref action_ref) = request.owner_action_ref {
+                    if let Some(action) =
+                        ActionRepository::find_by_ref(&state.db, action_ref).await?
+                    {
+                        tracing::debug!(
+                            "Auto-resolved owner_action from ref '{}' to id {}",
+                            action_ref,
+                            action.id
+                        );
+                        owner_action = Some(action.id);
+                    } else {
+                        return Err(ApiError::BadRequest(format!(
+                            "Action with ref '{}' not found",
+                            action_ref
+                        )));
+                    }
+                }
+            }
+        }
+        OwnerType::Pack => {
+            if owner_pack.is_none() {
+                if let Some(ref pack_ref) = request.owner_pack_ref {
+                    if let Some(pack) = PackRepository::find_by_ref(&state.db, pack_ref).await? {
+                        tracing::debug!(
+                            "Auto-resolved owner_pack from ref '{}' to id {}",
+                            pack_ref,
+                            pack.id
+                        );
+                        owner_pack = Some(pack.id);
+                    } else {
+                        return Err(ApiError::BadRequest(format!(
+                            "Pack with ref '{}' not found",
+                            pack_ref
+                        )));
+                    }
+                }
+            }
+        }
+        _ => {}
+    }
+
     // Encrypt value if requested
     let (value, encryption_key_hash) = if request.encrypted {
         let encryption_key = state
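The ref-to-ID resolution repeated per owner type above follows one pattern: keep an explicit ID if given, otherwise look the ref up and fail loudly if it does not exist. A condensed sketch of that pattern (generic helper and the sensor ref used below are hypothetical, for illustration only):

```python
def resolve_owner(owner_id, owner_ref, find_by_ref):
    """Return the owner ID, auto-resolving from owner_ref when only the
    ref is provided. Mirrors the per-arm logic in create_key (sketch)."""
    if owner_id is not None:
        return owner_id  # explicit ID wins; no lookup performed
    if owner_ref is not None:
        row = find_by_ref(owner_ref)
        if row is None:
            # Corresponds to ApiError::BadRequest in the handler.
            raise LookupError(f"Owner with ref '{owner_ref}' not found")
        return row["id"]
    return None  # neither ID nor ref supplied
```

In Rust the three arms cannot easily share one closure across repository types, which is one reason the handler spells each arm out.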
@@ -190,11 +266,11 @@ pub async fn create_key(
         owner_type: request.owner_type,
         owner: request.owner,
         owner_identity: request.owner_identity,
-        owner_pack: request.owner_pack,
+        owner_pack,
         owner_pack_ref: request.owner_pack_ref,
-        owner_action: request.owner_action,
+        owner_action,
         owner_action_ref: request.owner_action_ref,
-        owner_sensor: request.owner_sensor,
+        owner_sensor,
         owner_sensor_ref: request.owner_sensor_ref,
         name: request.name,
         encrypted: request.encrypted,
@@ -14,7 +14,10 @@ use validator::Validate;
 use attune_common::models::pack_test::PackTestResult;
 use attune_common::mq::{MessageEnvelope, MessageType, PackRegisteredPayload};
 use attune_common::repositories::{
+    action::ActionRepository,
     pack::{CreatePackInput, UpdatePackInput},
+    rule::{RestoreRuleInput, RuleRepository},
+    trigger::TriggerRepository,
     Create, Delete, FindById, FindByRef, PackRepository, PackTestRepository, Pagination, Update,
 };
 use attune_common::workflow::{PackWorkflowService, PackWorkflowServiceConfig};
@@ -545,6 +548,9 @@ async fn register_pack_internal(
         .and_then(|v| v.as_str())
         .map(|s| s.to_string());
 
+    // Ad-hoc rules to restore after pack reinstallation
+    let mut saved_adhoc_rules: Vec<attune_common::models::rule::Rule> = Vec::new();
+
     // Check if pack already exists
     if !force {
         if PackRepository::exists_by_ref(&state.db, &pack_ref).await? {
@@ -554,8 +560,20 @@ async fn register_pack_internal(
             )));
         }
     } else {
-        // Delete existing pack if force is true
+        // Delete existing pack if force is true, preserving ad-hoc (user-created) rules
         if let Some(existing_pack) = PackRepository::find_by_ref(&state.db, &pack_ref).await? {
+            // Save ad-hoc rules before deletion — CASCADE on pack FK would destroy them
+            saved_adhoc_rules = RuleRepository::find_adhoc_by_pack(&state.db, existing_pack.id)
+                .await
+                .unwrap_or_default();
+            if !saved_adhoc_rules.is_empty() {
+                tracing::info!(
+                    "Preserving {} ad-hoc rule(s) during reinstall of pack '{}'",
+                    saved_adhoc_rules.len(),
+                    pack_ref
+                );
+            }
+
             PackRepository::delete(&state.db, existing_pack.id).await?;
             tracing::info!("Deleted existing pack '{}' for forced reinstall", pack_ref);
         }
@@ -671,6 +689,123 @@ async fn register_pack_internal(
         }
     }
 
+    // Restore ad-hoc rules that were saved before pack deletion, and
+    // re-link any rules from other packs whose action/trigger FKs were
+    // set to NULL when the old pack's entities were cascade-deleted.
+    {
+        // Phase 1: Restore saved ad-hoc rules
+        if !saved_adhoc_rules.is_empty() {
+            let mut restored = 0u32;
+            for saved_rule in &saved_adhoc_rules {
+                // Resolve action and trigger IDs by ref (they may have been recreated)
+                let action_id = ActionRepository::find_by_ref(&state.db, &saved_rule.action_ref)
+                    .await
+                    .ok()
+                    .flatten()
+                    .map(|a| a.id);
+                let trigger_id = TriggerRepository::find_by_ref(&state.db, &saved_rule.trigger_ref)
+                    .await
+                    .ok()
+                    .flatten()
+                    .map(|t| t.id);
+
+                let input = RestoreRuleInput {
+                    r#ref: saved_rule.r#ref.clone(),
+                    pack: pack.id,
+                    pack_ref: pack.r#ref.clone(),
+                    label: saved_rule.label.clone(),
+                    description: saved_rule.description.clone(),
+                    action: action_id,
+                    action_ref: saved_rule.action_ref.clone(),
+                    trigger: trigger_id,
+                    trigger_ref: saved_rule.trigger_ref.clone(),
+                    conditions: saved_rule.conditions.clone(),
+                    action_params: saved_rule.action_params.clone(),
+                    trigger_params: saved_rule.trigger_params.clone(),
+                    enabled: saved_rule.enabled,
+                };
+
+                match RuleRepository::restore_rule(&state.db, input).await {
+                    Ok(rule) => {
+                        restored += 1;
+                        if rule.action.is_none() || rule.trigger.is_none() {
+                            tracing::warn!(
+                                "Restored ad-hoc rule '{}' with unresolved references \
+                                 (action: {}, trigger: {})",
+                                rule.r#ref,
+                                if rule.action.is_some() {
+                                    "linked"
+                                } else {
+                                    "NULL"
+                                },
+                                if rule.trigger.is_some() {
+                                    "linked"
+                                } else {
+                                    "NULL"
+                                },
+                            );
+                        }
+                    }
+                    Err(e) => {
+                        tracing::warn!(
+                            "Failed to restore ad-hoc rule '{}': {}",
+                            saved_rule.r#ref,
+                            e
+                        );
+                    }
+                }
+            }
+            tracing::info!(
+                "Restored {}/{} ad-hoc rule(s) for pack '{}'",
+                restored,
+                saved_adhoc_rules.len(),
+                pack.r#ref
+            );
+        }
+
+        // Phase 2: Re-link rules from other packs whose action/trigger FKs
+        // were set to NULL when the old pack's entities were cascade-deleted
+        let new_actions = ActionRepository::find_by_pack(&state.db, pack.id)
+            .await
+            .unwrap_or_default();
+        let new_triggers = TriggerRepository::find_by_pack(&state.db, pack.id)
+            .await
+            .unwrap_or_default();
+
+        for action in &new_actions {
+            match RuleRepository::relink_action_by_ref(&state.db, &action.r#ref, action.id).await {
+                Ok(count) if count > 0 => {
+                    tracing::info!("Re-linked {} rule(s) to action '{}'", count, action.r#ref);
+                }
+                Err(e) => {
+                    tracing::warn!(
+                        "Failed to re-link rules to action '{}': {}",
+                        action.r#ref,
+                        e
+                    );
+                }
+                _ => {}
+            }
+        }
+
+        for trigger in &new_triggers {
+            match RuleRepository::relink_trigger_by_ref(&state.db, &trigger.r#ref, trigger.id).await
+            {
+                Ok(count) if count > 0 => {
+                    tracing::info!("Re-linked {} rule(s) to trigger '{}'", count, trigger.r#ref);
+                }
+                Err(e) => {
+                    tracing::warn!(
+                        "Failed to re-link rules to trigger '{}': {}",
+                        trigger.r#ref,
+                        e
+                    );
+                }
+                _ => {}
+            }
+        }
+    }
+
     // Set up runtime environments for the pack's actions.
     // This creates virtualenvs, installs dependencies, etc. based on each
     // runtime's execution_config from the database.
@@ -964,7 +1099,6 @@ async fn register_pack_internal(
     responses(
         (status = 201, description = "Pack installed successfully", body = ApiResponse<PackInstallResponse>),
         (status = 400, description = "Invalid request or tests failed", body = ApiResponse<String>),
-        (status = 409, description = "Pack already exists", body = ApiResponse<String>),
         (status = 501, description = "Not implemented yet", body = ApiResponse<String>),
     ),
     security(("bearer_auth" = []))
@@ -1122,12 +1256,14 @@ pub async fn install_pack(
 
     tracing::info!("Pack moved to permanent storage: {:?}", final_path);
 
-    // Register the pack in database (from permanent storage location)
+    // Register the pack in database (from permanent storage location).
+    // Remote installs always force-overwrite: if you're pulling from a remote,
+    // the intent is to get that pack installed regardless of local state.
     let pack_id = register_pack_internal(
         state.clone(),
         user_sub,
         final_path.to_string_lossy().to_string(),
-        request.force,
+        true, // always force for remote installs
         request.skip_tests,
     )
    .await
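The preserve/restore sequence in `register_pack_internal` (snapshot ad-hoc rules, cascade-delete the pack, reinstall it, then re-resolve action/trigger ids by ref) can be sketched with an in-memory stand-in. `Store` and its methods are illustrative only; the real code goes through `RuleRepository`/`ActionRepository` against the database:

```python
# Minimal in-memory sketch of the preserve/restore flow during a forced reinstall.
class Store:
    def __init__(self):
        self.next_id = 1
        self.actions = {}  # action ref -> id
        self.rules = []    # rule dicts carrying both action_ref and action (id)

    def install_actions(self, refs):
        for r in refs:
            self.actions[r] = self.next_id
            self.next_id += 1

    def cascade_delete_pack(self):
        # CASCADE on the pack FK wipes the actions AND the rules pointing at
        # them, which is why ad-hoc rules are snapshotted first.
        self.actions.clear()
        self.rules.clear()

db = Store()
db.install_actions(["core.echo"])
db.rules.append({"ref": "my_rule", "action_ref": "core.echo",
                 "action": db.actions["core.echo"]})

saved = [dict(r) for r in db.rules]   # phase 0: snapshot before deletion
db.cascade_delete_pack()
db.install_actions(["core.echo"])     # forced reinstall assigns a new id

for r in saved:                       # phase 1: restore, re-resolving ids by ref
    r["action"] = db.actions.get(r["action_ref"])  # None if unresolved
    db.rules.append(r)

assert db.rules[0]["action"] == 2     # re-linked to the new id, not the old 1
```

The ref is the stable handle across the delete/recreate cycle, which is why the rule stores both the ref and the numeric FK.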
@@ -4,9 +4,10 @@ use axum::{
     extract::{Path, Query, State},
     http::StatusCode,
     response::IntoResponse,
-    routing::get,
+    routing::{get, post, put},
     Json, Router,
 };
+use std::path::PathBuf;
 use std::sync::Arc;
 use validator::Validate;
 
@@ -23,8 +24,8 @@ use crate::{
     dto::{
         common::{PaginatedResponse, PaginationParams},
         workflow::{
-            CreateWorkflowRequest, UpdateWorkflowRequest, WorkflowResponse, WorkflowSearchParams,
-            WorkflowSummary,
+            CreateWorkflowRequest, SaveWorkflowFileRequest, UpdateWorkflowRequest,
+            WorkflowResponse, WorkflowSearchParams, WorkflowSummary,
        },
        ApiResponse, SuccessResponse,
    },
@@ -340,6 +341,202 @@ pub async fn delete_workflow(
     Ok((StatusCode::OK, Json(response)))
 }
 
+/// Save a workflow file to disk and sync it to the database
+///
+/// Writes a `{name}.workflow.yaml` file to `{packs_base_dir}/{pack_ref}/actions/workflows/`
+/// and creates or updates the corresponding workflow_definition record in the database.
+#[utoipa::path(
+    post,
+    path = "/api/v1/packs/{pack_ref}/workflow-files",
+    tag = "workflows",
+    params(
+        ("pack_ref" = String, Path, description = "Pack reference identifier")
+    ),
+    request_body = SaveWorkflowFileRequest,
+    responses(
+        (status = 201, description = "Workflow file saved and synced", body = inline(ApiResponse<WorkflowResponse>)),
+        (status = 400, description = "Validation error"),
+        (status = 404, description = "Pack not found"),
+        (status = 409, description = "Workflow with same ref already exists"),
+        (status = 500, description = "Failed to write workflow file")
+    ),
+    security(("bearer_auth" = []))
+)]
+pub async fn save_workflow_file(
+    State(state): State<Arc<AppState>>,
+    RequireAuth(_user): RequireAuth,
+    Path(pack_ref): Path<String>,
+    Json(request): Json<SaveWorkflowFileRequest>,
+) -> ApiResult<impl IntoResponse> {
+    request.validate()?;
+
+    // Verify pack exists
+    let pack = PackRepository::find_by_ref(&state.db, &pack_ref)
+        .await?
+        .ok_or_else(|| ApiError::NotFound(format!("Pack '{}' not found", pack_ref)))?;
+
+    let workflow_ref = format!("{}.{}", pack_ref, request.name);
+
+    // Check if workflow already exists
+    if WorkflowDefinitionRepository::find_by_ref(&state.db, &workflow_ref)
+        .await?
+        .is_some()
+    {
+        return Err(ApiError::Conflict(format!(
+            "Workflow with ref '{}' already exists",
+            workflow_ref
+        )));
+    }
+
+    // Write YAML file to disk
+    let packs_base_dir = PathBuf::from(&state.config.packs_base_dir);
+    write_workflow_yaml(&packs_base_dir, &pack_ref, &request).await?;
+
+    // Create workflow in database
+    let definition_json = serde_json::to_value(&request.definition).map_err(|e| {
+        ApiError::BadRequest(format!("Failed to serialize workflow definition: {}", e))
+    })?;
+
+    let workflow_input = CreateWorkflowDefinitionInput {
+        r#ref: workflow_ref,
+        pack: pack.id,
+        pack_ref: pack.r#ref.clone(),
+        label: request.label,
+        description: request.description,
+        version: request.version,
+        param_schema: request.param_schema,
+        out_schema: request.out_schema,
+        definition: definition_json,
+        tags: request.tags.unwrap_or_default(),
+        enabled: request.enabled.unwrap_or(true),
+    };
+
+    let workflow = WorkflowDefinitionRepository::create(&state.db, workflow_input).await?;
+
+    let response = ApiResponse::with_message(
+        WorkflowResponse::from(workflow),
+        "Workflow file saved and synced successfully",
+    );
+
+    Ok((StatusCode::CREATED, Json(response)))
+}
+
+/// Update a workflow file on disk and sync changes to the database
+#[utoipa::path(
+    put,
+    path = "/api/v1/workflows/{ref}/file",
+    tag = "workflows",
+    params(
+        ("ref" = String, Path, description = "Workflow reference identifier")
+    ),
+    request_body = SaveWorkflowFileRequest,
+    responses(
+        (status = 200, description = "Workflow file updated and synced", body = inline(ApiResponse<WorkflowResponse>)),
+        (status = 400, description = "Validation error"),
+        (status = 404, description = "Workflow not found"),
+        (status = 500, description = "Failed to write workflow file")
+    ),
+    security(("bearer_auth" = []))
+)]
+pub async fn update_workflow_file(
+    State(state): State<Arc<AppState>>,
+    RequireAuth(_user): RequireAuth,
+    Path(workflow_ref): Path<String>,
+    Json(request): Json<SaveWorkflowFileRequest>,
+) -> ApiResult<impl IntoResponse> {
+    request.validate()?;
+
+    // Check if workflow exists
+    let existing_workflow = WorkflowDefinitionRepository::find_by_ref(&state.db, &workflow_ref)
+        .await?
+        .ok_or_else(|| ApiError::NotFound(format!("Workflow '{}' not found", workflow_ref)))?;
+
+    // Verify pack exists
+    let _pack = PackRepository::find_by_ref(&state.db, &request.pack_ref)
+        .await?
+        .ok_or_else(|| ApiError::NotFound(format!("Pack '{}' not found", request.pack_ref)))?;
+
+    // Write updated YAML file to disk
+    let packs_base_dir = PathBuf::from(&state.config.packs_base_dir);
+    write_workflow_yaml(&packs_base_dir, &request.pack_ref, &request).await?;
+
+    // Update workflow in database
+    let definition_json = serde_json::to_value(&request.definition).map_err(|e| {
+        ApiError::BadRequest(format!("Failed to serialize workflow definition: {}", e))
+    })?;
+
+    let update_input = UpdateWorkflowDefinitionInput {
+        label: Some(request.label),
+        description: request.description,
+        version: Some(request.version),
+        param_schema: request.param_schema,
+        out_schema: request.out_schema,
+        definition: Some(definition_json),
+        tags: request.tags,
+        enabled: request.enabled,
+    };
+
+    let workflow =
+        WorkflowDefinitionRepository::update(&state.db, existing_workflow.id, update_input).await?;
+
+    let response = ApiResponse::with_message(
+        WorkflowResponse::from(workflow),
+        "Workflow file updated and synced successfully",
+    );
+
+    Ok((StatusCode::OK, Json(response)))
+}
+
+/// Write a workflow definition to disk as YAML
+async fn write_workflow_yaml(
+    packs_base_dir: &PathBuf,
+    pack_ref: &str,
+    request: &SaveWorkflowFileRequest,
+) -> Result<(), ApiError> {
+    let workflows_dir = packs_base_dir
+        .join(pack_ref)
+        .join("actions")
+        .join("workflows");
+
+    // Ensure the directory exists
+    tokio::fs::create_dir_all(&workflows_dir)
+        .await
+        .map_err(|e| {
+            ApiError::InternalServerError(format!(
+                "Failed to create workflows directory '{}': {}",
+                workflows_dir.display(),
+                e
+            ))
+        })?;
+
+    let filename = format!("{}.workflow.yaml", request.name);
+    let filepath = workflows_dir.join(&filename);
+
+    // Serialize definition to YAML
+    let yaml_content = serde_yaml_ng::to_string(&request.definition).map_err(|e| {
+        ApiError::BadRequest(format!("Failed to serialize workflow to YAML: {}", e))
+    })?;
+
+    // Write file
+    tokio::fs::write(&filepath, yaml_content)
+        .await
+        .map_err(|e| {
+            ApiError::InternalServerError(format!(
+                "Failed to write workflow file '{}': {}",
+                filepath.display(),
+                e
+            ))
+        })?;
+
+    tracing::info!(
+        "Wrote workflow file: {} ({} bytes)",
+        filepath.display(),
+        filepath.metadata().map(|m| m.len()).unwrap_or(0)
+    );
+
+    Ok(())
+}
+
 /// Create workflow routes
 pub fn routes() -> Router<Arc<AppState>> {
     Router::new()
@@ -350,7 +547,9 @@ pub fn routes() -> Router<Arc<AppState>> {
                 .put(update_workflow)
                 .delete(delete_workflow),
         )
+        .route("/workflows/{ref}/file", put(update_workflow_file))
         .route("/packs/{pack_ref}/workflows", get(list_workflows_by_pack))
+        .route("/packs/{pack_ref}/workflow-files", post(save_workflow_file))
 }
 
 #[cfg(test)]
@@ -362,4 +561,43 @@ mod tests {
         // Just verify the router can be constructed
         let _router = routes();
     }
+
+    #[test]
+    fn test_save_request_validation() {
+        let req = SaveWorkflowFileRequest {
+            name: "test_workflow".to_string(),
+            label: "Test Workflow".to_string(),
+            description: Some("A test workflow".to_string()),
+            version: "1.0.0".to_string(),
+            pack_ref: "core".to_string(),
+            definition: serde_json::json!({
+                "ref": "core.test_workflow",
+                "label": "Test Workflow",
+                "version": "1.0.0",
+                "tasks": [{"name": "task1", "action": "core.echo"}]
+            }),
+            param_schema: None,
+            out_schema: None,
+            tags: None,
+            enabled: None,
+        };
+        assert!(req.validate().is_ok());
+    }
+
+    #[test]
+    fn test_save_request_validation_empty_name() {
+        let req = SaveWorkflowFileRequest {
+            name: "".to_string(), // Invalid: empty
+            label: "Test".to_string(),
+            description: None,
+            version: "1.0.0".to_string(),
+            pack_ref: "core".to_string(),
+            definition: serde_json::json!({}),
+            param_schema: None,
+            out_schema: None,
+            tags: None,
+            enabled: None,
+        };
+        assert!(req.validate().is_err());
+    }
 }
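The on-disk layout written by `write_workflow_yaml` follows the `{packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml` convention described in AGENTS.md. A Python sketch of just the path construction; the directory and pack values are made up for illustration:

```python
from pathlib import Path

def workflow_file_path(packs_base_dir: str, pack_ref: str, name: str) -> Path:
    # Mirrors the joins in write_workflow_yaml: base / pack / actions / workflows / file
    return (Path(packs_base_dir) / pack_ref / "actions" / "workflows"
            / f"{name}.workflow.yaml")

print(workflow_file_path("/opt/attune/packs", "core", "my_flow"))
```

Keeping this join logic in one helper is what lets both the POST (create) and PUT (update) endpoints agree on where the file lives.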
@@ -1,9 +1,14 @@
 //! Parameter validation module
 //!
-//! Validates trigger and action parameters against their declared JSON schemas.
-//! Template-aware: values containing `{{ }}` template expressions are replaced
-//! with schema-appropriate placeholders before validation, so template expressions
-//! pass type checks while literal values are still validated normally.
+//! Validates trigger and action parameters against their declared schemas.
+//! Schemas use the flat StackStorm-style format:
+//! { "param_name": { "type": "string", "required": true, "secret": true, ... }, ... }
+//!
+//! Before validation, flat schemas are converted to standard JSON Schema so we
+//! can reuse the `jsonschema` crate. Template-aware: values containing `{{ }}`
+//! template expressions are replaced with schema-appropriate placeholders before
+//! validation, so template expressions pass type checks while literal values are
+//! still validated normally.
 
 use attune_common::models::{action::Action, trigger::Trigger};
 use jsonschema::Validator;
@@ -11,6 +16,68 @@ use serde_json::Value;
 
 use crate::middleware::ApiError;
 
+/// Convert a flat StackStorm-style parameter schema into a standard JSON Schema
+/// object suitable for `jsonschema::Validator`.
+///
+/// Input (flat):
+/// ```json
+/// { "url": { "type": "string", "required": true }, "timeout": { "type": "integer", "default": 30 } }
+/// ```
+///
+/// Output (JSON Schema):
+/// ```json
+/// { "type": "object", "properties": { "url": { "type": "string" }, "timeout": { "type": "integer", "default": 30 } }, "required": ["url"] }
+/// ```
+fn flat_to_json_schema(flat: &Value) -> Value {
+    let Some(map) = flat.as_object() else {
+        // Not an object — return a permissive schema
+        return serde_json::json!({});
+    };
+
+    // If it already looks like a JSON Schema (has "type": "object" + "properties"),
+    // pass it through unchanged for backward tolerance.
+    if map.get("type").and_then(|v| v.as_str()) == Some("object") && map.contains_key("properties")
+    {
+        return flat.clone();
+    }
+
+    let mut properties = serde_json::Map::new();
+    let mut required: Vec<Value> = Vec::new();
+
+    for (key, prop_def) in map {
+        let Some(prop_obj) = prop_def.as_object() else {
+            // Skip non-object entries (shouldn't happen in valid schemas)
+            continue;
+        };
+
+        // Clone the property definition, stripping `required` and `secret`
+        // (they are not valid JSON Schema keywords).
+        let mut clean = prop_obj.clone();
+        let is_required = clean
+            .remove("required")
+            .and_then(|v| v.as_bool())
+            .unwrap_or(false);
+        clean.remove("secret");
+        // `position` is also an Attune extension, not JSON Schema
+        clean.remove("position");
+
+        if is_required {
+            required.push(Value::String(key.clone()));
+        }
+
+        properties.insert(key.clone(), Value::Object(clean));
+    }
+
+    let mut schema = serde_json::Map::new();
+    schema.insert("type".to_string(), Value::String("object".to_string()));
+    schema.insert("properties".to_string(), Value::Object(properties));
+    if !required.is_empty() {
+        schema.insert("required".to_string(), Value::Array(required));
+    }
+
+    Value::Object(schema)
+}
+
 /// Check if a JSON value is (or contains) a template expression.
 fn is_template_expression(value: &Value) -> bool {
     match value {
@@ -100,7 +167,8 @@ fn placeholder_for_schema(property_schema: &Value) -> Value {
 /// schema-appropriate placeholders. Only replaces leaf values that match
 /// `{{ ... }}`; non-template values are left untouched for normal validation.
 ///
-/// `schema` should be the full JSON Schema object (with `properties`, `type`, etc).
+/// `schema` must be a standard JSON Schema object (with `properties`, `type`, etc).
+/// Call `flat_to_json_schema` first if starting from flat format.
 fn replace_templates_with_placeholders(params: &Value, schema: &Value) -> Value {
     match params {
         Value::Object(map) => {
@@ -164,17 +232,23 @@ fn replace_templates_with_placeholders(params: &Value, schema: &Value) -> Value
 
 /// Validate trigger parameters against the trigger's parameter schema.
 /// Template expressions (`{{ ... }}`) are accepted for any field type.
+///
+/// The schema is expected in flat StackStorm format and is converted to
+/// JSON Schema internally for validation.
 pub fn validate_trigger_params(trigger: &Trigger, params: &Value) -> Result<(), ApiError> {
     // If no schema is defined, accept any parameters
-    let Some(schema) = &trigger.param_schema else {
+    let Some(flat_schema) = &trigger.param_schema else {
         return Ok(());
     };
 
+    // Convert flat format to JSON Schema for validation
+    let schema = flat_to_json_schema(flat_schema);
+
     // Replace template expressions with schema-appropriate placeholders
-    let sanitized = replace_templates_with_placeholders(params, schema);
+    let sanitized = replace_templates_with_placeholders(params, &schema);
 
     // Compile the JSON schema
-    let compiled_schema = Validator::new(schema).map_err(|e| {
+    let compiled_schema = Validator::new(&schema).map_err(|e| {
         ApiError::InternalServerError(format!(
             "Invalid parameter schema for trigger '{}': {}",
             trigger.r#ref, e
@@ -207,17 +281,23 @@ pub fn validate_trigger_params(trigger: &Trigger, params: &Value) -> Result<(),
 
 /// Validate action parameters against the action's parameter schema.
 /// Template expressions (`{{ ... }}`) are accepted for any field type.
+///
+/// The schema is expected in flat StackStorm format and is converted to
+/// JSON Schema internally for validation.
 pub fn validate_action_params(action: &Action, params: &Value) -> Result<(), ApiError> {
     // If no schema is defined, accept any parameters
-    let Some(schema) = &action.param_schema else {
+    let Some(flat_schema) = &action.param_schema else {
        return Ok(());
    };
 
+    // Convert flat format to JSON Schema for validation
+    let schema = flat_to_json_schema(flat_schema);
+
     // Replace template expressions with schema-appropriate placeholders
-    let sanitized = replace_templates_with_placeholders(params, schema);
+    let sanitized = replace_templates_with_placeholders(params, &schema);
 
     // Compile the JSON schema
-    let compiled_schema = Validator::new(schema).map_err(|e| {
+    let compiled_schema = Validator::new(&schema).map_err(|e| {
         ApiError::InternalServerError(format!(
             "Invalid parameter schema for action '{}': {}",
             action.r#ref, e
@@ -309,15 +389,65 @@ mod tests {
 
     // ── Basic trigger validation (no templates) ──────────────────────
 
+    // ── flat_to_json_schema unit tests ───────────────────────────────
+
+    #[test]
+    fn test_flat_to_json_schema_basic() {
+        let flat = json!({
+            "url": { "type": "string", "required": true },
+            "timeout": { "type": "integer", "default": 30 }
+        });
+        let result = flat_to_json_schema(&flat);
+        assert_eq!(result["type"], "object");
+        assert_eq!(result["properties"]["url"]["type"], "string");
+        // `required` should be stripped from individual properties
+        assert!(result["properties"]["url"].get("required").is_none());
+        assert_eq!(result["properties"]["timeout"]["default"], 30);
+        // Top-level required array should contain "url"
+        let req = result["required"].as_array().unwrap();
+        assert!(req.contains(&json!("url")));
+        assert!(!req.contains(&json!("timeout")));
+    }
+
+    #[test]
+    fn test_flat_to_json_schema_strips_secret_and_position() {
+        let flat = json!({
+            "token": { "type": "string", "secret": true, "position": 0, "required": true }
+        });
+        let result = flat_to_json_schema(&flat);
+        let token = &result["properties"]["token"];
+        assert!(token.get("secret").is_none());
+        assert!(token.get("position").is_none());
+        assert!(token.get("required").is_none());
+    }
+
+    #[test]
+    fn test_flat_to_json_schema_empty() {
+        let flat = json!({});
+        let result = flat_to_json_schema(&flat);
+        assert_eq!(result["type"], "object");
+        assert!(result.get("required").is_none());
+    }
+
+    #[test]
+    fn test_flat_to_json_schema_passthrough_json_schema() {
+        // If already JSON Schema format, pass through unchanged
+        let js = json!({
+            "type": "object",
+            "properties": { "x": { "type": "string" } },
+            "required": ["x"]
+        });
+        let result = flat_to_json_schema(&js);
+        assert_eq!(result, js);
+    }
+
+    // ── Basic trigger validation (flat format) ──────────────────────
 
     #[test]
     fn test_validate_trigger_params_with_valid_params() {
         let schema = json!({
-            "type": "object",
-            "properties": {
-                "unit": { "type": "string", "enum": ["seconds", "minutes", "hours"] },
-                "delta": { "type": "integer", "minimum": 1 }
-            },
-            "required": ["unit", "delta"]
+            "unit": { "type": "string", "enum": ["seconds", "minutes", "hours"], "required": true },
+            "delta": { "type": "integer", "minimum": 1, "required": true }
         });
 
         let trigger = make_trigger(Some(schema));
@@ -328,12 +458,8 @@ mod tests {
     #[test]
     fn test_validate_trigger_params_with_invalid_params() {
        let schema = json!({
||||||
"type": "object",
|
"unit": { "type": "string", "enum": ["seconds", "minutes", "hours"], "required": true },
|
||||||
"properties": {
|
"delta": { "type": "integer", "minimum": 1, "required": true }
|
||||||
"unit": { "type": "string", "enum": ["seconds", "minutes", "hours"] },
|
|
||||||
"delta": { "type": "integer", "minimum": 1 }
|
|
||||||
},
|
|
||||||
"required": ["unit", "delta"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let trigger = make_trigger(Some(schema));
|
let trigger = make_trigger(Some(schema));
|
||||||
@@ -351,16 +477,12 @@ mod tests {
|
|||||||
assert!(validate_trigger_params(&trigger, ¶ms).is_err());
|
assert!(validate_trigger_params(&trigger, ¶ms).is_err());
|
||||||
}
|
}
|
||||||
|
|
||||||
// ── Basic action validation (no templates) ───────────────────────
|
// ── Basic action validation (flat format) ───────────────────────
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn test_validate_action_params_with_valid_params() {
|
fn test_validate_action_params_with_valid_params() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"message": { "type": "string", "required": true }
|
||||||
"properties": {
|
|
||||||
"message": { "type": "string" }
|
|
||||||
},
|
|
||||||
"required": ["message"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -371,11 +493,7 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_validate_action_params_with_empty_params_but_required_fields() {
|
fn test_validate_action_params_with_empty_params_but_required_fields() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"message": { "type": "string", "required": true }
|
||||||
"properties": {
|
|
||||||
"message": { "type": "string" }
|
|
||||||
},
|
|
||||||
"required": ["message"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -383,16 +501,12 @@ mod tests {
|
|||||||
assert!(validate_action_params(&action, ¶ms).is_err());
|
assert!(validate_action_params(&action, ¶ms).is_err());
|
||||||
}
|
}
|
||||||
|
|
||||||
// ── Template-aware validation ────────────────────────────────────
|
// ── Template-aware validation (flat format) ──────────────────────
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn test_template_in_integer_field_passes() {
|
fn test_template_in_integer_field_passes() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"counter": { "type": "integer", "required": true }
|
||||||
"properties": {
|
|
||||||
"counter": { "type": "integer" }
|
|
||||||
},
|
|
||||||
"required": ["counter"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -403,11 +517,7 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_template_in_boolean_field_passes() {
|
fn test_template_in_boolean_field_passes() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"verbose": { "type": "boolean", "required": true }
|
||||||
"properties": {
|
|
||||||
"verbose": { "type": "boolean" }
|
|
||||||
},
|
|
||||||
"required": ["verbose"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -418,11 +528,7 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_template_in_number_field_passes() {
|
fn test_template_in_number_field_passes() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"threshold": { "type": "number", "minimum": 0.0, "required": true }
|
||||||
"properties": {
|
|
||||||
"threshold": { "type": "number", "minimum": 0.0 }
|
|
||||||
},
|
|
||||||
"required": ["threshold"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -433,11 +539,7 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_template_in_enum_field_passes() {
|
fn test_template_in_enum_field_passes() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"level": { "type": "string", "enum": ["info", "warn", "error"], "required": true }
|
||||||
"properties": {
|
|
||||||
"level": { "type": "string", "enum": ["info", "warn", "error"] }
|
|
||||||
},
|
|
||||||
"required": ["level"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -448,11 +550,7 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_template_in_array_field_passes() {
|
fn test_template_in_array_field_passes() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"recipients": { "type": "array", "items": { "type": "string" }, "required": true }
|
||||||
"properties": {
|
|
||||||
"recipients": { "type": "array", "items": { "type": "string" } }
|
|
||||||
},
|
|
||||||
"required": ["recipients"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -463,11 +561,7 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_template_in_object_field_passes() {
|
fn test_template_in_object_field_passes() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"metadata": { "type": "object", "required": true }
|
||||||
"properties": {
|
|
||||||
"metadata": { "type": "object" }
|
|
||||||
},
|
|
||||||
"required": ["metadata"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -478,13 +572,9 @@ mod tests {
|
|||||||
#[test]
|
#[test]
|
||||||
fn test_mixed_template_and_literal_values() {
|
fn test_mixed_template_and_literal_values() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
"type": "object",
|
"message": { "type": "string", "required": true },
|
||||||
"properties": {
|
"count": { "type": "integer", "required": true },
|
||||||
"message": { "type": "string" },
|
"verbose": { "type": "boolean", "required": true }
|
||||||
"count": { "type": "integer" },
|
|
||||||
"verbose": { "type": "boolean" }
|
|
||||||
},
|
|
||||||
"required": ["message", "count", "verbose"]
|
|
||||||
});
|
});
|
||||||
|
|
||||||
let action = make_action(Some(schema));
|
let action = make_action(Some(schema));
|
||||||
@@ -498,6 +588,26 @@ mod tests {
|
|||||||
assert!(validate_action_params(&action, ¶ms).is_ok());
|
assert!(validate_action_params(&action, ¶ms).is_ok());
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ── Secret fields are ignored during validation ──────────────────
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_secret_field_validated_normally() {
|
||||||
|
let schema = json!({
|
||||||
|
"api_key": { "type": "string", "required": true, "secret": true },
|
||||||
|
"endpoint": { "type": "string" }
|
||||||
|
});
|
||||||
|
|
||||||
|
let action = make_action(Some(schema));
|
||||||
|
|
||||||
|
// Valid: secret field provided
|
||||||
|
let params = json!({ "api_key": "sk-1234", "endpoint": "https://api.example.com" });
|
||||||
|
assert!(validate_action_params(&action, ¶ms).is_ok());
|
||||||
|
|
||||||
|
// Invalid: secret field missing but required
|
||||||
|
let params = json!({ "endpoint": "https://api.example.com" });
|
||||||
|
assert!(validate_action_params(&action, ¶ms).is_err());
|
||||||
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn test_literal_values_still_validated() {
|
fn test_literal_values_still_validated() {
|
||||||
let schema = json!({
|
let schema = json!({
|
||||||
|
|||||||
@@ -8,6 +8,26 @@ use sqlx::{Executor, Postgres, QueryBuilder};
 
 use super::{Create, Delete, FindById, FindByRef, List, Repository, Update};
 
+/// Input for restoring an ad-hoc rule during pack reinstallation.
+/// Unlike `CreateRuleInput`, action and trigger IDs are optional because
+/// the referenced entities may not exist yet or may have been removed.
+#[derive(Debug, Clone)]
+pub struct RestoreRuleInput {
+    pub r#ref: String,
+    pub pack: Id,
+    pub pack_ref: String,
+    pub label: String,
+    pub description: String,
+    pub action: Option<Id>,
+    pub action_ref: String,
+    pub trigger: Option<Id>,
+    pub trigger_ref: String,
+    pub conditions: serde_json::Value,
+    pub action_params: serde_json::Value,
+    pub trigger_params: serde_json::Value,
+    pub enabled: bool,
+}
+
 /// Repository for Rule operations
 pub struct RuleRepository;
 
@@ -337,4 +357,121 @@ impl RuleRepository {
 
         Ok(rules)
     }
 
+    /// Find ad-hoc (user-created) rules belonging to a specific pack.
+    /// Used to preserve custom rules during pack reinstallation.
+    pub async fn find_adhoc_by_pack<'e, E>(executor: E, pack_id: Id) -> Result<Vec<Rule>>
+    where
+        E: Executor<'e, Database = Postgres> + 'e,
+    {
+        let rules = sqlx::query_as::<_, Rule>(
+            r#"
+            SELECT id, ref, pack, pack_ref, label, description, action, action_ref,
+                   trigger, trigger_ref, conditions, action_params, trigger_params, enabled, is_adhoc, created, updated
+            FROM rule
+            WHERE pack = $1 AND is_adhoc = true
+            ORDER BY ref ASC
+            "#,
+        )
+        .bind(pack_id)
+        .fetch_all(executor)
+        .await?;
+
+        Ok(rules)
+    }
+
+    /// Restore an ad-hoc rule after pack reinstallation.
+    /// Accepts `Option<Id>` for action and trigger so the rule is preserved
+    /// even if its referenced entities no longer exist.
+    pub async fn restore_rule<'e, E>(executor: E, input: RestoreRuleInput) -> Result<Rule>
+    where
+        E: Executor<'e, Database = Postgres> + 'e,
+    {
+        let rule = sqlx::query_as::<_, Rule>(
+            r#"
+            INSERT INTO rule (ref, pack, pack_ref, label, description, action, action_ref,
+                              trigger, trigger_ref, conditions, action_params, trigger_params, enabled, is_adhoc)
+            VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, true)
+            RETURNING id, ref, pack, pack_ref, label, description, action, action_ref,
+                      trigger, trigger_ref, conditions, action_params, trigger_params, enabled, is_adhoc, created, updated
+            "#,
+        )
+        .bind(&input.r#ref)
+        .bind(input.pack)
+        .bind(&input.pack_ref)
+        .bind(&input.label)
+        .bind(&input.description)
+        .bind(input.action)
+        .bind(&input.action_ref)
+        .bind(input.trigger)
+        .bind(&input.trigger_ref)
+        .bind(&input.conditions)
+        .bind(&input.action_params)
+        .bind(&input.trigger_params)
+        .bind(input.enabled)
+        .fetch_one(executor)
+        .await
+        .map_err(|e| {
+            if let sqlx::Error::Database(ref db_err) = e {
+                if db_err.is_unique_violation() {
+                    return Error::already_exists("Rule", "ref", &input.r#ref);
+                }
+            }
+            e.into()
+        })?;
+
+        Ok(rule)
+    }
+
+    /// Re-link rules whose action FK is NULL back to a newly recreated action,
+    /// matched by `action_ref`. Used after pack reinstallation to fix rules
+    /// from other packs that referenced actions in the reinstalled pack.
+    pub async fn relink_action_by_ref<'e, E>(
+        executor: E,
+        action_ref: &str,
+        action_id: Id,
+    ) -> Result<u64>
+    where
+        E: Executor<'e, Database = Postgres> + 'e,
+    {
+        let result = sqlx::query(
+            r#"
+            UPDATE rule
+            SET action = $1, updated = NOW()
+            WHERE action IS NULL AND action_ref = $2
+            "#,
+        )
+        .bind(action_id)
+        .bind(action_ref)
+        .execute(executor)
+        .await?;
+
+        Ok(result.rows_affected())
+    }
+
+    /// Re-link rules whose trigger FK is NULL back to a newly recreated trigger,
+    /// matched by `trigger_ref`. Used after pack reinstallation to fix rules
+    /// from other packs that referenced triggers in the reinstalled pack.
+    pub async fn relink_trigger_by_ref<'e, E>(
+        executor: E,
+        trigger_ref: &str,
+        trigger_id: Id,
+    ) -> Result<u64>
+    where
+        E: Executor<'e, Database = Postgres> + 'e,
+    {
+        let result = sqlx::query(
+            r#"
+            UPDATE rule
+            SET trigger = $1, updated = NOW()
+            WHERE trigger IS NULL AND trigger_ref = $2
+            "#,
+        )
+        .bind(trigger_id)
+        .bind(trigger_ref)
+        .execute(executor)
+        .await?;
+
+        Ok(result.rows_affected())
+    }
 }
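The two re-link helpers above enforce a simple invariant: a rule whose foreign key was nulled out (its action or trigger row was deleted during reinstall) is re-pointed only when its stored string ref matches the recreated entity. A dependency-free sketch of that invariant, with illustrative types and refs rather than the real `Rule` model:

```rust
// Illustrative in-memory model of the SQL UPDATE above; `RuleRow` and the
// refs used in main() are made up for the sketch.
#[derive(Debug, PartialEq)]
struct RuleRow {
    action: Option<u64>, // FK, None while the referenced action is missing
    action_ref: String,  // stable string ref that survives reinstall
}

fn relink_action_by_ref(rules: &mut [RuleRow], action_ref: &str, action_id: u64) -> u64 {
    let mut affected = 0;
    for rule in rules.iter_mut() {
        // Mirrors: WHERE action IS NULL AND action_ref = $2
        if rule.action.is_none() && rule.action_ref == action_ref {
            rule.action = Some(action_id); // SET action = $1
            affected += 1;
        }
    }
    affected
}

fn main() {
    let mut rules = vec![
        RuleRow { action: None, action_ref: "linux.check_loadavg".into() },
        RuleRow { action: Some(7), action_ref: "linux.check_loadavg".into() },
        RuleRow { action: None, action_ref: "core.echo".into() },
    ];
    // Only the orphaned rule with a matching ref is re-pointed
    let n = relink_action_by_ref(&mut rules, "linux.check_loadavg", 42);
    assert_eq!(n, 1);
    assert_eq!(rules[0].action, Some(42));
    assert_eq!(rules[1].action, Some(7)); // already linked: untouched
    assert_eq!(rules[2].action, None);    // different ref: untouched
    println!("relinked {n} rule(s)");
}
```

Because already-linked rows are excluded by the `IS NULL` guard, the pass is idempotent and safe to run once per recreated entity.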
@@ -15,7 +15,8 @@ pub use pack_service::{
 };
 pub use parser::{
     parse_workflow_file, parse_workflow_yaml, workflow_to_json, BackoffStrategy, DecisionBranch,
-    ParseError, ParseResult, PublishDirective, RetryConfig, Task, TaskType, WorkflowDefinition,
+    ParseError, ParseResult, PublishDirective, RetryConfig, Task, TaskTransition, TaskType,
+    WorkflowDefinition,
 };
 pub use registrar::{RegistrationOptions, RegistrationResult, WorkflowRegistrar};
 pub use validator::{ValidationError, ValidationResult, WorkflowValidator};
@@ -2,6 +2,38 @@
 //!
 //! This module handles parsing workflow YAML files into structured Rust types
 //! that can be validated and stored in the database.
+//!
+//! Supports two task transition formats:
+//!
+//! **New format (Orquesta-style `next` array):**
+//! ```yaml
+//! tasks:
+//!   - name: task1
+//!     action: core.echo
+//!     next:
+//!       - when: "{{ succeeded() }}"
+//!         publish:
+//!           - result: "{{ result() }}"
+//!         do:
+//!           - task2
+//!           - log
+//!       - when: "{{ failed() }}"
+//!         do:
+//!           - error_handler
+//! ```
+//!
+//! **Legacy format (flat fields):**
+//! ```yaml
+//! tasks:
+//!   - name: task1
+//!     action: core.echo
+//!     on_success: task2
+//!     on_failure: error_handler
+//! ```
+//!
+//! When legacy fields are present, they are automatically converted to `next`
+//! transitions during parsing. The canonical internal representation always
+//! uses the `next` array.
 
 use serde::{Deserialize, Serialize};
 use serde_json::Value as JsonValue;
@@ -85,7 +117,40 @@ pub struct WorkflowDefinition {
     pub tags: Vec<String>,
 }
 
-/// Task definition - can be action, parallel, or workflow type
+// ---------------------------------------------------------------------------
+// Task transition types (Orquesta-style)
+// ---------------------------------------------------------------------------
+
+/// A single task transition evaluated after task completion.
+///
+/// Transitions are evaluated in order. When `when` is not defined,
+/// the transition is unconditional (fires on any completion).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct TaskTransition {
+    /// Condition expression (e.g., "{{ succeeded() }}", "{{ failed() }}")
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub when: Option<String>,
+
+    /// Variables to publish into the workflow context on this transition
+    #[serde(default, skip_serializing_if = "Vec::is_empty")]
+    pub publish: Vec<PublishDirective>,
+
+    /// Next tasks to invoke when transition criteria is met
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub r#do: Option<Vec<String>>,
+}
+
+// ---------------------------------------------------------------------------
+// Task definition
+// ---------------------------------------------------------------------------
+
+/// Task definition - can be action, parallel, or workflow type.
+///
+/// Supports both the new `next` transition format and legacy flat fields
+/// (`on_success`, `on_failure`, etc.) for backward compatibility. During
+/// deserialization the legacy fields are captured; call
+/// [`Task::normalize_transitions`] (done automatically during parsing) to
+/// merge them into the canonical `next` array.
 #[derive(Debug, Clone, Serialize, Deserialize, Validate)]
 pub struct Task {
     /// Unique task name within the workflow
@@ -103,7 +168,7 @@ pub struct Task {
     #[serde(default)]
     pub input: HashMap<String, JsonValue>,
 
-    /// Conditional execution
+    /// Conditional execution (task-level — controls whether this task runs)
     pub when: Option<String>,
 
     /// With-items iteration
@@ -115,41 +180,195 @@ pub struct Task {
     /// Concurrency limit for with-items
     pub concurrency: Option<usize>,
 
-    /// Variable publishing
-    #[serde(default)]
-    pub publish: Vec<PublishDirective>,
-
     /// Retry configuration
     pub retry: Option<RetryConfig>,
 
     /// Timeout in seconds
     pub timeout: Option<u32>,
 
-    /// Transition on success
+    /// Orquesta-style transitions — the canonical representation.
+    /// Each entry can specify a `when` condition, `publish` directives,
+    /// and a list of next tasks (`do`).
+    #[serde(default)]
+    pub next: Vec<TaskTransition>,
+
+    // -- Legacy transition fields (read during deserialization) -------------
+    // These are kept for backward compatibility with older workflow YAML
+    // files. During [`normalize_transitions`] they are folded into `next`.
+    /// Legacy: transition on success
+    #[serde(default, skip_serializing_if = "Option::is_none")]
     pub on_success: Option<String>,
 
-    /// Transition on failure
+    /// Legacy: transition on failure
+    #[serde(default, skip_serializing_if = "Option::is_none")]
     pub on_failure: Option<String>,
 
-    /// Transition on complete (regardless of status)
+    /// Legacy: transition on complete (regardless of status)
+    #[serde(default, skip_serializing_if = "Option::is_none")]
     pub on_complete: Option<String>,
 
-    /// Transition on timeout
+    /// Legacy: transition on timeout
+    #[serde(default, skip_serializing_if = "Option::is_none")]
     pub on_timeout: Option<String>,
 
-    /// Decision-based transitions
-    #[serde(default)]
+    /// Legacy: decision-based transitions
+    #[serde(default, skip_serializing_if = "Vec::is_empty")]
     pub decision: Vec<DecisionBranch>,
 
-    /// Join barrier - wait for N inbound tasks to complete before executing
-    /// If not specified, task executes immediately when any predecessor completes
-    /// Special value "all" can be represented as the count of inbound edges
+    /// Legacy: task-level variable publishing (moved to per-transition in new model)
+    #[serde(default, skip_serializing_if = "Vec::is_empty")]
+    pub publish: Vec<PublishDirective>,
+
+    /// Join barrier - wait for N inbound tasks to complete before executing.
+    /// If not specified, task executes immediately when any predecessor completes.
+    /// Special value "all" can be represented as the count of inbound edges.
     pub join: Option<usize>,
 
     /// Parallel tasks (for parallel type)
     pub tasks: Option<Vec<Task>>,
 }
+
+impl Task {
+    /// Returns `true` if any legacy transition fields are populated.
+    fn has_legacy_transitions(&self) -> bool {
+        self.on_success.is_some()
+            || self.on_failure.is_some()
+            || self.on_complete.is_some()
+            || self.on_timeout.is_some()
+            || !self.decision.is_empty()
+    }
+
+    /// Convert legacy flat transition fields into the `next` array.
+    ///
+    /// If `next` is already populated, legacy fields are ignored (the new
+    /// format takes precedence). After normalization the legacy fields are
+    /// cleared so serialization only emits the canonical `next` form.
+    pub fn normalize_transitions(&mut self) {
+        // If `next` is already populated, the new format wins — clear legacy
+        if !self.next.is_empty() {
+            self.clear_legacy_fields();
+            return;
+        }
+
+        // Nothing to convert
+        if !self.has_legacy_transitions() && self.publish.is_empty() {
+            return;
+        }
+
+        let mut transitions: Vec<TaskTransition> = Vec::new();
+
+        if let Some(ref target) = self.on_success {
+            transitions.push(TaskTransition {
+                when: Some("{{ succeeded() }}".to_string()),
+                publish: Vec::new(),
+                r#do: Some(vec![target.clone()]),
+            });
+        }
+
+        if let Some(ref target) = self.on_failure {
+            transitions.push(TaskTransition {
+                when: Some("{{ failed() }}".to_string()),
+                publish: Vec::new(),
+                r#do: Some(vec![target.clone()]),
+            });
+        }
+
+        if let Some(ref target) = self.on_complete {
+            // on_complete = unconditional
+            transitions.push(TaskTransition {
+                when: None,
+                publish: Vec::new(),
+                r#do: Some(vec![target.clone()]),
+            });
+        }
+
+        if let Some(ref target) = self.on_timeout {
+            transitions.push(TaskTransition {
+                when: Some("{{ timed_out() }}".to_string()),
+                publish: Vec::new(),
+                r#do: Some(vec![target.clone()]),
+            });
+        }
+
+        // Convert legacy decision branches
+        for branch in &self.decision {
+            transitions.push(TaskTransition {
+                when: branch.when.clone(),
+                publish: Vec::new(),
+                r#do: Some(vec![branch.next.clone()]),
+            });
+        }
+
+        // Attach legacy task-level publish to the first succeeded transition,
+        // or create a publish-only transition if none exist
+        if !self.publish.is_empty() {
+            let succeeded_idx = transitions
+                .iter()
+                .position(|t| matches!(&t.when, Some(w) if w.contains("succeeded()")));
+
+            if let Some(idx) = succeeded_idx {
+                transitions[idx].publish = self.publish.clone();
+            } else if transitions.is_empty() {
+                transitions.push(TaskTransition {
+                    when: Some("{{ succeeded() }}".to_string()),
+                    publish: self.publish.clone(),
+                    r#do: None,
+                });
+            } else {
+                // Attach to the first transition
+                transitions[0].publish = self.publish.clone();
+            }
+        }
+
+        self.next = transitions;
+        self.clear_legacy_fields();
+    }
+
+    /// Clear legacy transition fields after normalization
+    fn clear_legacy_fields(&mut self) {
+        self.on_success = None;
+        self.on_failure = None;
+        self.on_complete = None;
+        self.on_timeout = None;
+        self.decision.clear();
+        self.publish.clear();
+    }
+
+    /// Collect all task names referenced by transitions (both `next` and legacy).
+    /// Used for validation.
+    pub fn all_transition_targets(&self) -> Vec<&str> {
+        let mut targets: Vec<&str> = Vec::new();
+
+        // From `next` array
+        for transition in &self.next {
+            if let Some(ref do_list) = transition.r#do {
+                for target in do_list {
+                    targets.push(target.as_str());
+                }
+            }
+        }
+
+        // From legacy fields (in case normalize hasn't been called yet)
+        if let Some(ref t) = self.on_success {
+            targets.push(t.as_str());
+        }
+        if let Some(ref t) = self.on_failure {
+            targets.push(t.as_str());
+        }
+        if let Some(ref t) = self.on_complete {
+            targets.push(t.as_str());
+        }
+        if let Some(ref t) = self.on_timeout {
+            targets.push(t.as_str());
+        }
+        for branch in &self.decision {
+            targets.push(branch.next.as_str());
+        }
+
+        targets
+    }
+}
 
 fn default_task_type() -> TaskType {
     TaskType::Action
 }
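The precedence rule in `normalize_transitions` (a populated `next` wins outright; otherwise legacy fields are folded into generated transitions) can be exercised in isolation. This is a condensed, dependency-free sketch covering only `on_success`/`on_failure`; the real method above also folds `on_complete`, `on_timeout`, `decision`, and `publish`, and these simplified types are not the crate's actual `Task`/`TaskTransition`:

```rust
// Simplified stand-ins for the parser types, for demonstration only.
#[derive(Debug, Clone, PartialEq)]
struct Transition {
    when: Option<String>,
    targets: Vec<String>,
}

#[derive(Default)]
struct LegacyTask {
    next: Vec<Transition>,
    on_success: Option<String>,
    on_failure: Option<String>,
}

impl LegacyTask {
    fn normalize(&mut self) {
        if !self.next.is_empty() {
            // New format wins; legacy fields are dropped
            self.on_success = None;
            self.on_failure = None;
            return;
        }
        if let Some(t) = self.on_success.take() {
            self.next.push(Transition {
                when: Some("{{ succeeded() }}".into()),
                targets: vec![t],
            });
        }
        if let Some(t) = self.on_failure.take() {
            self.next.push(Transition {
                when: Some("{{ failed() }}".into()),
                targets: vec![t],
            });
        }
    }
}

fn main() {
    let mut task = LegacyTask {
        on_success: Some("task2".into()),
        on_failure: Some("error_handler".into()),
        ..Default::default()
    };
    task.normalize();
    assert_eq!(task.next.len(), 2);
    assert_eq!(task.next[0].when.as_deref(), Some("{{ succeeded() }}"));
    assert_eq!(task.next[0].targets, vec!["task2"]);
    assert!(task.on_success.is_none()); // legacy fields cleared
    println!("{:?}", task.next);
}
```

Normalizing at parse time means every downstream consumer (validator, registrar, UI) only ever sees the `next` form.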
@@ -214,7 +433,7 @@ pub enum BackoffStrategy {
|
|||||||
Exponential,
|
Exponential,
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Decision-based transition
|
/// Legacy decision-based transition (kept for backward compatibility)
|
||||||
#[derive(Debug, Clone, Serialize, Deserialize)]
|
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||||
pub struct DecisionBranch {
|
pub struct DecisionBranch {
|
||||||
/// Condition to evaluate (template string)
|
/// Condition to evaluate (template string)
|
||||||
@@ -228,10 +447,17 @@ pub struct DecisionBranch {
|
|||||||
pub default: bool,
|
pub default: bool,
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// Parsing & validation
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
/// Parse workflow YAML string into WorkflowDefinition
|
/// Parse workflow YAML string into WorkflowDefinition
|
||||||
pub fn parse_workflow_yaml(yaml: &str) -> ParseResult<WorkflowDefinition> {
|
pub fn parse_workflow_yaml(yaml: &str) -> ParseResult<WorkflowDefinition> {
|
||||||
// Parse YAML
|
// Parse YAML
|
||||||
let workflow: WorkflowDefinition = serde_yaml_ng::from_str(yaml)?;
|
let mut workflow: WorkflowDefinition = serde_yaml_ng::from_str(yaml)?;
|
||||||
|
|
||||||
|
// Normalize legacy transitions into `next` arrays
|
||||||
|
normalize_all_transitions(&mut workflow);
|
||||||
|
|
||||||
// Validate structure
|
// Validate structure
|
||||||
workflow.validate()?;
|
workflow.validate()?;
|
||||||
@@ -249,6 +475,19 @@ pub fn parse_workflow_file(path: &std::path::Path) -> ParseResult<WorkflowDefini
|
|||||||
parse_workflow_yaml(&contents)
|
parse_workflow_yaml(&contents)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// Normalize all tasks in a workflow definition, converting legacy fields to `next`.
|
||||||
|
fn normalize_all_transitions(workflow: &mut WorkflowDefinition) {
|
||||||
|
for task in &mut workflow.tasks {
|
||||||
|
task.normalize_transitions();
|
||||||
|
// Recursively normalize sub-tasks (parallel)
|
||||||
|
if let Some(ref mut sub_tasks) = task.tasks {
|
||||||
|
for sub in sub_tasks {
|
||||||
|
sub.normalize_transitions();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
/// Validate workflow structure and references
|
/// Validate workflow structure and references
|
||||||
fn validate_workflow_structure(workflow: &WorkflowDefinition) -> ParseResult<()> {
|
fn validate_workflow_structure(workflow: &WorkflowDefinition) -> ParseResult<()> {
|
||||||
// Collect all task names
|
// Collect all task names
|
||||||
@@ -294,30 +533,12 @@ fn validate_task(task: &Task, task_names: &std::collections::HashSet<&str>) -> ParseResult<()> {
         }
     }

-    // Validate transitions reference existing tasks
-    for transition in [
-        &task.on_success,
-        &task.on_failure,
-        &task.on_complete,
-        &task.on_timeout,
-    ]
-    .iter()
-    .filter_map(|t| t.as_ref())
-    {
-        if !task_names.contains(transition.as_str()) {
-            return Err(ParseError::InvalidTaskReference(format!(
-                "Task '{}' references non-existent task '{}'",
-                task.name, transition
-            )));
-        }
-    }
-
-    // Validate decision branches
-    for branch in &task.decision {
-        if !task_names.contains(branch.next.as_str()) {
-            return Err(ParseError::InvalidTaskReference(format!(
-                "Task '{}' decision branch references non-existent task '{}'",
-                task.name, branch.next
+    // Validate all transition targets reference existing tasks
+    for target in task.all_transition_targets() {
+        if !task_names.contains(target) {
+            return Err(ParseError::InvalidTaskReference(format!(
+                "Task '{}' references non-existent task '{}'",
+                task.name, target
             )));
         }
     }
@@ -352,8 +573,12 @@ pub fn workflow_to_json(workflow: &WorkflowDefinition) -> Result<JsonValue, serde_json::Error> {
 mod tests {
     use super::*;

+    // -----------------------------------------------------------------------
+    // Legacy format tests (backward compatibility)
+    // -----------------------------------------------------------------------
+
     #[test]
-    fn test_parse_simple_workflow() {
+    fn test_parse_simple_workflow_legacy() {
         let yaml = r#"
 ref: test.simple_workflow
 label: Simple Workflow
@@ -371,15 +596,26 @@ tasks:
 "#;

         let result = parse_workflow_yaml(yaml);
-        assert!(result.is_ok());
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
         let workflow = result.unwrap();
         assert_eq!(workflow.tasks.len(), 2);
         assert_eq!(workflow.tasks[0].name, "task1");
+
+        // Legacy on_success should have been normalized into `next`
+        assert!(workflow.tasks[0].on_success.is_none());
+        assert_eq!(workflow.tasks[0].next.len(), 1);
+        assert_eq!(
+            workflow.tasks[0].next[0].when.as_deref(),
+            Some("{{ succeeded() }}")
+        );
+        assert_eq!(
+            workflow.tasks[0].next[0].r#do,
+            Some(vec!["task2".to_string()])
+        );
     }

     #[test]
-    fn test_cycles_now_allowed() {
-        // After Orquesta-style refactoring, cycles are now supported
+    fn test_cycles_now_allowed_legacy() {
         let yaml = r#"
 ref: test.circular
 label: Circular Workflow (Now Allowed)
@@ -403,7 +639,7 @@ tasks:
     }

     #[test]
-    fn test_invalid_task_reference() {
+    fn test_invalid_task_reference_legacy() {
         let yaml = r#"
 ref: test.invalid_ref
 label: Invalid Reference
@@ -418,12 +654,12 @@ tasks:
         assert!(result.is_err());
         match result {
             Err(ParseError::InvalidTaskReference(_)) => (),
-            _ => panic!("Expected InvalidTaskReference error"),
+            other => panic!("Expected InvalidTaskReference error, got: {:?}", other),
         }
     }

     #[test]
-    fn test_parallel_task() {
+    fn test_parallel_task_legacy() {
         let yaml = r#"
 ref: test.parallel
 label: Parallel Workflow
@@ -442,12 +678,357 @@ tasks:
 "#;

         let result = parse_workflow_yaml(yaml);
-        assert!(result.is_ok());
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
         let workflow = result.unwrap();
         assert_eq!(workflow.tasks[0].r#type, TaskType::Parallel);
         assert_eq!(workflow.tasks[0].tasks.as_ref().unwrap().len(), 2);
+        // Legacy on_success converted to next
+        assert_eq!(workflow.tasks[0].next.len(), 1);
     }

+    // -----------------------------------------------------------------------
+    // New format tests (Orquesta-style `next`)
+    // -----------------------------------------------------------------------
+
+    #[test]
+    fn test_parse_next_format_simple() {
+        let yaml = r#"
+ref: test.next_simple
+label: Next Format Workflow
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    input:
+      message: "Hello"
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task2
+  - name: task2
+    action: core.echo
+    input:
+      message: "World"
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        assert_eq!(workflow.tasks.len(), 2);
+        assert_eq!(workflow.tasks[0].next.len(), 1);
+        assert_eq!(
+            workflow.tasks[0].next[0].when.as_deref(),
+            Some("{{ succeeded() }}")
+        );
+        assert_eq!(
+            workflow.tasks[0].next[0].r#do,
+            Some(vec!["task2".to_string()])
+        );
+    }
+
+    #[test]
+    fn test_parse_next_format_multiple_transitions() {
+        let yaml = r#"
+ref: test.next_multi
+label: Multi-Transition Workflow
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        publish:
+          - msg: "task1 done"
+          - result_val: "{{ result() }}"
+        do:
+          - log
+          - task3
+      - when: "{{ failed() }}"
+        publish:
+          - msg: "task1 failed"
+        do:
+          - log
+          - error_handler
+  - name: task3
+    action: core.complete
+  - name: log
+    action: core.log
+  - name: error_handler
+    action: core.handle_error
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+
+        let task1 = &workflow.tasks[0];
+        assert_eq!(task1.next.len(), 2);
+
+        // First transition: succeeded
+        assert_eq!(task1.next[0].when.as_deref(), Some("{{ succeeded() }}"));
+        assert_eq!(task1.next[0].publish.len(), 2);
+        assert_eq!(
+            task1.next[0].r#do,
+            Some(vec!["log".to_string(), "task3".to_string()])
+        );
+
+        // Second transition: failed
+        assert_eq!(task1.next[1].when.as_deref(), Some("{{ failed() }}"));
+        assert_eq!(task1.next[1].publish.len(), 1);
+        assert_eq!(
+            task1.next[1].r#do,
+            Some(vec!["log".to_string(), "error_handler".to_string()])
+        );
+    }
+
+    #[test]
+    fn test_parse_next_format_publish_only() {
+        let yaml = r#"
+ref: test.publish_only
+label: Publish Only Workflow
+version: 1.0.0
+tasks:
+  - name: compute
+    action: math.add
+    next:
+      - when: "{{ succeeded() }}"
+        publish:
+          - result: "{{ result() }}"
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        let task = &workflow.tasks[0];
+        assert_eq!(task.next.len(), 1);
+        assert!(task.next[0].r#do.is_none());
+        assert_eq!(task.next[0].publish.len(), 1);
+    }
+
+    #[test]
+    fn test_parse_next_format_unconditional() {
+        let yaml = r#"
+ref: test.unconditional
+label: Unconditional Transition
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - do:
+          - task2
+  - name: task2
+    action: core.echo
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        assert_eq!(workflow.tasks[0].next.len(), 1);
+        assert!(workflow.tasks[0].next[0].when.is_none());
+        assert_eq!(
+            workflow.tasks[0].next[0].r#do,
+            Some(vec!["task2".to_string()])
+        );
+    }
+
+    #[test]
+    fn test_next_takes_precedence_over_legacy() {
+        // When both `next` and legacy fields are present, `next` wins
+        let yaml = r#"
+ref: test.precedence
+label: Precedence Test
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    on_success: task2
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task3
+  - name: task2
+    action: core.echo
+  - name: task3
+    action: core.echo
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        let task1 = &workflow.tasks[0];
+
+        // `next` should contain only the explicit next entry, not the legacy one
+        assert_eq!(task1.next.len(), 1);
+        assert_eq!(task1.next[0].r#do, Some(vec!["task3".to_string()]));
+        // Legacy field should have been cleared
+        assert!(task1.on_success.is_none());
+    }
+
+    #[test]
+    fn test_invalid_task_reference_in_next() {
+        let yaml = r#"
+ref: test.invalid_next_ref
+label: Invalid Next Ref
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - nonexistent_task
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_err());
+        match result {
+            Err(ParseError::InvalidTaskReference(msg)) => {
+                assert!(msg.contains("nonexistent_task"));
+            }
+            other => panic!("Expected InvalidTaskReference error, got: {:?}", other),
+        }
+    }
+
+    #[test]
+    fn test_cycles_allowed_in_next_format() {
+        let yaml = r#"
+ref: test.cycle_next
+label: Cycle with Next
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task2
+  - name: task2
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task1
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Cycles should be allowed");
+    }
+
+    #[test]
+    fn test_legacy_all_transition_types() {
+        let yaml = r#"
+ref: test.all_legacy
+label: All Legacy Types
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    on_success: task_s
+    on_failure: task_f
+    on_complete: task_c
+    on_timeout: task_t
+  - name: task_s
+    action: core.echo
+  - name: task_f
+    action: core.echo
+  - name: task_c
+    action: core.echo
+  - name: task_t
+    action: core.echo
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        let task1 = &workflow.tasks[0];
+
+        // All legacy fields should be normalized into `next`
+        assert_eq!(task1.next.len(), 4);
+        assert!(task1.on_success.is_none());
+        assert!(task1.on_failure.is_none());
+        assert!(task1.on_complete.is_none());
+        assert!(task1.on_timeout.is_none());
+
+        // Check the order and conditions
+        assert_eq!(task1.next[0].when.as_deref(), Some("{{ succeeded() }}"));
+        assert_eq!(task1.next[0].r#do, Some(vec!["task_s".to_string()]));
+
+        assert_eq!(task1.next[1].when.as_deref(), Some("{{ failed() }}"));
+        assert_eq!(task1.next[1].r#do, Some(vec!["task_f".to_string()]));
+
+        // on_complete → unconditional
+        assert!(task1.next[2].when.is_none());
+        assert_eq!(task1.next[2].r#do, Some(vec!["task_c".to_string()]));
+
+        assert_eq!(task1.next[3].when.as_deref(), Some("{{ timed_out() }}"));
+        assert_eq!(task1.next[3].r#do, Some(vec!["task_t".to_string()]));
+    }
+
+    #[test]
+    fn test_legacy_publish_attached_to_succeeded_transition() {
+        let yaml = r#"
+ref: test.legacy_publish
+label: Legacy Publish
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    on_success: task2
+    publish:
+      - result: "done"
+  - name: task2
+    action: core.echo
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        let task1 = &workflow.tasks[0];
+
+        assert_eq!(task1.next.len(), 1);
+        assert_eq!(task1.next[0].publish.len(), 1);
+        assert!(task1.publish.is_empty()); // cleared after normalization
+    }
+
+    #[test]
+    fn test_legacy_decision_branches() {
+        let yaml = r#"
+ref: test.decision
+label: Decision Workflow
+version: 1.0.0
+tasks:
+  - name: check
+    action: core.check
+    decision:
+      - when: "{{ result().status == 'ok' }}"
+        next: success_task
+      - when: "{{ result().status == 'error' }}"
+        next: error_task
+  - name: success_task
+    action: core.echo
+  - name: error_task
+    action: core.echo
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        let task = &workflow.tasks[0];
+
+        assert_eq!(task.next.len(), 2);
+        assert!(task.decision.is_empty()); // cleared
+        assert_eq!(
+            task.next[0].when.as_deref(),
+            Some("{{ result().status == 'ok' }}")
+        );
+        assert_eq!(task.next[0].r#do, Some(vec!["success_task".to_string()]));
+    }
+
+    // -----------------------------------------------------------------------
+    // Existing tests
+    // -----------------------------------------------------------------------
+
     #[test]
     fn test_with_items() {
         let yaml = r#"
@@ -471,27 +1052,98 @@ tasks:
     }

     #[test]
-    fn test_retry_config() {
+    fn test_json_roundtrip() {
         let yaml = r#"
-ref: test.retry
-label: Retry Workflow
+ref: test.roundtrip
+label: Roundtrip Test
 version: 1.0.0
 tasks:
-  - name: flaky_task
-    action: core.flaky
-    retry:
-      count: 5
-      delay: 10
-      backoff: exponential
-      max_delay: 60
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        publish:
+          - msg: "done"
+        do:
+          - task2
+  - name: task2
+    action: core.echo
+"#;
+
+        let workflow = parse_workflow_yaml(yaml).unwrap();
+        let json = workflow_to_json(&workflow).unwrap();
+
+        // Verify the JSON has the `next` array
+        let tasks = json.get("tasks").unwrap().as_array().unwrap();
+        let task1_next = tasks[0].get("next").unwrap().as_array().unwrap();
+        assert_eq!(task1_next.len(), 1);
+        assert_eq!(
+            task1_next[0].get("when").unwrap().as_str().unwrap(),
+            "{{ succeeded() }}"
+        );
+
+        // Verify legacy fields are absent
+        assert!(tasks[0].get("on_success").is_none());
+    }
+
+    #[test]
+    fn test_workflow_with_join() {
+        let yaml = r#"
+ref: test.join
+label: Join Workflow
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task3
+  - name: task2
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task3
+  - name: task3
+    join: 2
+    action: core.echo
 "#;

         let result = parse_workflow_yaml(yaml);
-        assert!(result.is_ok());
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
         let workflow = result.unwrap();
-        let retry = workflow.tasks[0].retry.as_ref().unwrap();
-        assert_eq!(retry.count, 5);
-        assert_eq!(retry.delay, 10);
-        assert_eq!(retry.backoff, BackoffStrategy::Exponential);
+        assert_eq!(workflow.tasks[2].join, Some(2));
+    }
+
+    #[test]
+    fn test_multiple_do_targets() {
+        let yaml = r#"
+ref: test.multi_do
+label: Multiple Do Targets
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task2
+          - task3
+  - name: task2
+    action: core.echo
+  - name: task3
+    action: core.echo
+"#;
+
+        let result = parse_workflow_yaml(yaml);
+        assert!(result.is_ok(), "Parse failed: {:?}", result.err());
+        let workflow = result.unwrap();
+        let task1 = &workflow.tasks[0];
+        assert_eq!(task1.next.len(), 1);
+        assert_eq!(
+            task1.next[0].r#do,
+            Some(vec!["task2".to_string(), "task3".to_string()])
+        );
     }
 }
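The validator in this commit replaces four per-field loops with a single pass over each task's collected transition targets, then uses the resulting adjacency map and predecessor set to find start tasks. A self-contained sketch of that shape (simplified types; the real code calls `Task::all_transition_targets()` on the parsed definition):

```rust
use std::collections::{HashMap, HashSet};

// Hypothetical stand-in for the parsed task; `targets` plays the role of
// all `do` targets across the task's transitions.
struct Task {
    name: String,
    targets: Vec<String>,
}

// Build the adjacency graph and the set of tasks that have a predecessor.
fn build_graph(tasks: &[Task]) -> (HashMap<String, Vec<String>>, HashSet<String>) {
    let mut graph = HashMap::new();
    let mut has_predecessor = HashSet::new();
    for task in tasks {
        graph.insert(task.name.clone(), task.targets.clone());
        for target in &task.targets {
            has_predecessor.insert(target.clone());
        }
    }
    (graph, has_predecessor)
}

fn main() {
    let tasks = vec![
        Task { name: "a".into(), targets: vec!["b".into(), "c".into()] },
        Task { name: "b".into(), targets: vec![] },
        Task { name: "c".into(), targets: vec![] },
    ];
    let (graph, preds) = build_graph(&tasks);
    // "a" is never a transition target, so it is a start task.
    assert!(!preds.contains("a"));
    assert_eq!(graph["a"], vec!["b".to_string(), "c".to_string()]);
    println!(
        "start tasks: {:?}",
        graph.keys().filter(|k| !preds.contains(*k)).collect::<Vec<_>>()
    );
}
```

Collapsing the legacy fields into one target list is what lets both the cycle check and the start-task detection stay oblivious to whether a transition came from `on_success`, `decision`, or the new `next` array.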
@@ -254,24 +254,11 @@ impl WorkflowValidator {
         let mut graph = HashMap::new();

         for task in &workflow.tasks {
-            let mut transitions = Vec::new();
-
-            if let Some(ref next) = task.on_success {
-                transitions.push(next.clone());
-            }
-            if let Some(ref next) = task.on_failure {
-                transitions.push(next.clone());
-            }
-            if let Some(ref next) = task.on_complete {
-                transitions.push(next.clone());
-            }
-            if let Some(ref next) = task.on_timeout {
-                transitions.push(next.clone());
-            }
-
-            for branch in &task.decision {
-                transitions.push(branch.next.clone());
-            }
+            let transitions: Vec<String> = task
+                .all_transition_targets()
+                .into_iter()
+                .map(|s| s.to_string())
+                .collect();

             graph.insert(task.name.clone(), transitions);
         }
@@ -284,21 +271,8 @@
         let mut has_predecessor = HashSet::new();

         for task in &workflow.tasks {
-            if let Some(ref next) = task.on_success {
-                has_predecessor.insert(next.clone());
-            }
-            if let Some(ref next) = task.on_failure {
-                has_predecessor.insert(next.clone());
-            }
-            if let Some(ref next) = task.on_complete {
-                has_predecessor.insert(next.clone());
-            }
-            if let Some(ref next) = task.on_timeout {
-                has_predecessor.insert(next.clone());
-            }
-
-            for branch in &task.decision {
-                has_predecessor.insert(branch.next.clone());
+            for target in task.all_transition_targets() {
+                has_predecessor.insert(target.to_string());
             }
         }
@@ -3,6 +3,12 @@
|
|||||||
//! This module builds executable task graphs from workflow definitions.
|
//! This module builds executable task graphs from workflow definitions.
|
||||||
//! Workflows are directed graphs where tasks are nodes and transitions are edges.
|
//! Workflows are directed graphs where tasks are nodes and transitions are edges.
|
||||||
//! Execution follows transitions from completed tasks, naturally supporting cycles.
|
//! Execution follows transitions from completed tasks, naturally supporting cycles.
|
||||||
|
//!
|
||||||
|
//! Uses the Orquesta-style `next` transition model where each task has an ordered
|
||||||
|
//! list of transitions. Each transition can specify:
|
||||||
|
//! - `when` — a condition expression (e.g., "{{ succeeded() }}", "{{ failed() }}")
|
||||||
|
//! - `publish` — variables to publish into the workflow context
|
||||||
|
//! - `do` — next tasks to invoke when the condition is met
|
||||||
|
|
||||||
use attune_common::workflow::{Task, TaskType, WorkflowDefinition};
|
use attune_common::workflow::{Task, TaskType, WorkflowDefinition};
|
||||||
use std::collections::{HashMap, HashSet};
|
use std::collections::{HashMap, HashSet};
|
||||||
@@ -51,7 +57,7 @@ pub struct TaskNode {
|
|||||||
/// Input template
|
/// Input template
|
||||||
pub input: serde_json::Value,
|
pub input: serde_json::Value,
|
||||||
|
|
||||||
/// Conditional execution
|
/// Conditional execution (task-level — controls whether the task runs at all)
|
||||||
pub when: Option<String>,
|
pub when: Option<String>,
|
||||||
|
|
||||||
/// With-items iteration
|
/// With-items iteration
|
||||||
@@ -63,17 +69,14 @@ pub struct TaskNode {
|
|||||||
/// Concurrency limit
|
/// Concurrency limit
|
||||||
pub concurrency: Option<usize>,
|
pub concurrency: Option<usize>,
|
||||||
|
|
||||||
/// Variable publishing directives
|
|
||||||
pub publish: Vec<String>,
|
|
||||||
|
|
||||||
/// Retry configuration
|
/// Retry configuration
|
||||||
pub retry: Option<RetryConfig>,
|
pub retry: Option<RetryConfig>,
|
||||||
|
|
||||||
/// Timeout in seconds
|
/// Timeout in seconds
|
||||||
pub timeout: Option<u32>,
|
pub timeout: Option<u32>,
|
||||||
|
|
||||||
/// Transitions
|
/// Orquesta-style transitions — evaluated in order after task completes
|
||||||
pub transitions: TaskTransitions,
|
pub transitions: Vec<GraphTransition>,
|
||||||
|
|
||||||
/// Sub-tasks (for parallel tasks)
|
/// Sub-tasks (for parallel tasks)
|
||||||
pub sub_tasks: Option<Vec<TaskNode>>,
|
pub sub_tasks: Option<Vec<TaskNode>>,
|
||||||
@@ -85,22 +88,27 @@ pub struct TaskNode {
|
|||||||
pub join: Option<usize>,
|
pub join: Option<usize>,
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Task transitions
|
/// A single transition in the task graph (Orquesta-style).
|
||||||
#[derive(Debug, Clone, Default, serde::Serialize, serde::Deserialize)]
|
///
|
||||||
pub struct TaskTransitions {
|
/// Transitions are evaluated in order after a task completes. When `when` is
|
||||||
pub on_success: Option<String>,
|
/// `None` the transition is unconditional.
|
||||||
pub on_failure: Option<String>,
|
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
|
||||||
pub on_complete: Option<String>,
|
pub struct GraphTransition {
|
||||||
pub on_timeout: Option<String>,
|
/// Condition expression (e.g., "{{ succeeded() }}", "{{ failed() }}")
|
||||||
pub decision: Vec<DecisionBranch>,
|
pub when: Option<String>,
|
||||||
|
|
||||||
|
/// Variable publishing directives (key-value pairs)
|
||||||
|
pub publish: Vec<PublishVar>,
|
||||||
|
|
||||||
|
/// Next tasks to invoke when transition criteria is met
|
||||||
|
pub do_tasks: Vec<String>,
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Decision branch
|
/// A single publish variable (key = expression)
|
||||||
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
|
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
|
||||||
pub struct DecisionBranch {
|
pub struct PublishVar {
|
||||||
pub when: Option<String>,
|
pub name: String,
|
||||||
pub next: String,
|
pub expression: String,
|
||||||
pub default: bool,
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/// Retry configuration
|
/// Retry configuration
|
||||||
@@ -121,8 +129,56 @@ pub enum BackoffStrategy {
|
|||||||
Exponential,
|
Exponential,
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// Transition classification helpers
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
|
/// Classify a `when` expression for quick matching.
|
||||||
|
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
|
||||||
|
pub enum TransitionKind {
|
||||||
|
/// Matches `succeeded()` expressions
|
||||||
|
Succeeded,
|
||||||
|
/// Matches `failed()` expressions
|
||||||
|
Failed,
|
||||||
|
/// Matches `timed_out()` expressions
|
||||||
|
TimedOut,
|
||||||
|
/// No condition — fires on any completion
|
||||||
|
Always,
|
||||||
|
/// Custom condition expression
|
||||||
|
Custom,
|
||||||
|
}
|
||||||
|
|
||||||
|
impl GraphTransition {
|
||||||
|
/// Classify this transition's `when` expression into a [`TransitionKind`].
|
||||||
|
pub fn kind(&self) -> TransitionKind {
|
||||||
|
match &self.when {
|
||||||
|
None => TransitionKind::Always,
|
||||||
|
Some(expr) => {
|
||||||
|
let normalized = expr.to_lowercase().replace(|c: char| c.is_whitespace(), "");
|
||||||
|
if normalized.contains("succeeded()") {
|
||||||
|
TransitionKind::Succeeded
|
||||||
|
} else if normalized.contains("failed()") {
|
||||||
|
TransitionKind::Failed
|
||||||
|
} else if normalized.contains("timed_out()") {
|
||||||
|
TransitionKind::TimedOut
|
||||||
|
} else {
|
||||||
|
TransitionKind::Custom
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
// TaskGraph implementation
|
||||||
|
// ---------------------------------------------------------------------------
|
||||||
|
|
||||||
impl TaskGraph {
|
impl TaskGraph {
|
||||||
/// Create a graph from a workflow definition
|
/// Create a graph from a workflow definition.
|
||||||
|
///
|
||||||
|
/// The workflow's tasks should already have their transitions normalized
|
||||||
|
/// (legacy `on_success`/`on_failure` fields merged into `next`) — this is
|
||||||
|
/// done automatically by [`attune_common::workflow::parse_workflow_yaml`].
|
||||||
pub fn from_workflow(workflow: &WorkflowDefinition) -> GraphResult<Self> {
|
pub fn from_workflow(workflow: &WorkflowDefinition) -> GraphResult<Self> {
|
||||||
let mut builder = GraphBuilder::new();
|
let mut builder = GraphBuilder::new();
|
||||||
|
|
||||||
@@ -149,39 +205,92 @@ impl TaskGraph {
|
|||||||
}
|
}
|
||||||
|
|
||||||
/// Get the next tasks to execute after a task completes.
|
/// Get the next tasks to execute after a task completes.
|
||||||
/// Evaluates transitions based on task status.
|
///
|
||||||
|
/// Evaluates transitions in order based on the task's completion status.
|
||||||
|
/// A transition fires if its `when` condition matches the task status:
|
||||||
|
/// - `succeeded()` fires when `success == true`
|
||||||
|
/// - `failed()` fires when `success == false`
|
||||||
|
/// - No condition (always) fires regardless
|
||||||
|
/// - Custom conditions are included (actual expression evaluation
|
||||||
|
/// happens in the workflow coordinator with runtime context)
|
||||||
|
///
|
||||||
|
/// Multiple transitions can fire — they are independent of each other.
|
||||||
///
|
///
|
||||||
/// # Arguments
|
/// # Arguments
|
||||||
/// * `task_name` - The name of the task that completed
|
/// * `task_name` - The name of the task that completed
|
||||||
/// * `success` - Whether the task succeeded
|
/// * `success` - Whether the task succeeded
|
||||||
///
|
///
|
||||||
/// # Returns
|
/// # Returns
|
||||||
/// A vector of task names to schedule next
|
/// A vector of (task_name, publish_vars) tuples to schedule next
|
||||||
     pub fn next_tasks(&self, task_name: &str, success: bool) -> Vec<String> {
         let mut next = Vec::new();

         if let Some(node) = self.nodes.get(task_name) {
-            // Check explicit transitions based on task status
-            if success {
-                if let Some(ref next_task) = node.transitions.on_success {
-                    next.push(next_task.clone());
-                }
-            } else if let Some(ref next_task) = node.transitions.on_failure {
-                next.push(next_task.clone());
-            }
+            for transition in &node.transitions {
+                let should_fire = match transition.kind() {
+                    TransitionKind::Succeeded => success,
+                    TransitionKind::Failed => !success,
+                    TransitionKind::TimedOut => !success, // timeout is a form of failure
+                    TransitionKind::Always => true,
+                    TransitionKind::Custom => true, // include custom — real eval in coordinator
+                };

-            // on_complete runs regardless of success/failure
-            if let Some(ref next_task) = node.transitions.on_complete {
-                next.push(next_task.clone());
+                if should_fire {
+                    for target in &transition.do_tasks {
+                        if !next.contains(target) {
+                            next.push(target.clone());
+                        }
+                    }
+                }
             }

-            // Decision branches (evaluated separately in coordinator with context)
-            // We don't evaluate them here since they need runtime context
         }

         next
     }
+
+    /// Get the next tasks with full transition information.
+    ///
+    /// Returns matching transitions with their publish directives and targets,
+    /// giving the coordinator full context for variable publishing.
+    pub fn matching_transitions(&self, task_name: &str, success: bool) -> Vec<&GraphTransition> {
+        let mut matching = Vec::new();
+
+        if let Some(node) = self.nodes.get(task_name) {
+            for transition in &node.transitions {
+                let should_fire = match transition.kind() {
+                    TransitionKind::Succeeded => success,
+                    TransitionKind::Failed => !success,
+                    TransitionKind::TimedOut => !success,
+                    TransitionKind::Always => true,
+                    TransitionKind::Custom => true,
+                };
+
+                if should_fire {
+                    matching.push(transition);
                 }
+            }
+        }

+        matching
+    }
+
+    /// Collect all unique target task names from all transitions of a given task.
+    pub fn all_transition_targets(&self, task_name: &str) -> HashSet<String> {
+        let mut targets = HashSet::new();
+        if let Some(node) = self.nodes.get(task_name) {
+            for transition in &node.transitions {
+                for target in &transition.do_tasks {
+                    targets.insert(target.clone());
+                }
+            }
+        }
+        targets
+    }
 }

+// ---------------------------------------------------------------------------
+// Graph builder
+// ---------------------------------------------------------------------------

 /// Graph builder helper
 struct GraphBuilder {
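The transition-firing rule used by `next_tasks` above can be sketched in isolation. This is a minimal standalone sketch, not the crate's actual code: `TransitionKind` mirrors the diff, but `should_fire` here is a free function standing in for the `kind()` dispatch on a real transition.

```rust
/// Transition kinds as introduced in the diff above.
#[derive(Debug, PartialEq)]
pub enum TransitionKind {
    Succeeded,
    Failed,
    TimedOut,
    Always,
    Custom,
}

/// Firing rule mirroring the `should_fire` match in `next_tasks`:
/// a timed-out transition counts as a form of failure, and custom
/// conditions are optimistically included here (real evaluation
/// happens in the coordinator, which has runtime context).
pub fn should_fire(kind: &TransitionKind, success: bool) -> bool {
    match kind {
        TransitionKind::Succeeded => success,
        TransitionKind::Failed => !success,
        TransitionKind::TimedOut => !success,
        TransitionKind::Always => true,
        TransitionKind::Custom => true,
    }
}
```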
@@ -198,14 +307,12 @@ impl GraphBuilder {
     }

     fn add_task(&mut self, task: &Task) -> GraphResult<()> {
-        let node = self.task_to_node(task)?;
+        let node = Self::task_to_node(task)?;
         self.nodes.insert(task.name.clone(), node);
         Ok(())
     }

-    fn task_to_node(&self, task: &Task) -> GraphResult<TaskNode> {
-        let publish = extract_publish_vars(&task.publish);
+    fn task_to_node(task: &Task) -> GraphResult<TaskNode> {

         let retry = task.retry.as_ref().map(|r| RetryConfig {
             count: r.count,
             delay: r.delay,
@@ -220,26 +327,21 @@ impl GraphBuilder {
             on_error: r.on_error.clone(),
         });

-        let transitions = TaskTransitions {
-            on_success: task.on_success.clone(),
-            on_failure: task.on_failure.clone(),
-            on_complete: task.on_complete.clone(),
-            on_timeout: task.on_timeout.clone(),
-            decision: task
-                .decision
+        // Convert parser TaskTransition list → graph GraphTransition list
+        let transitions: Vec<GraphTransition> = task
+            .next
             .iter()
-            .map(|d| DecisionBranch {
-                when: d.when.clone(),
-                next: d.next.clone(),
-                default: d.default,
+            .map(|t| GraphTransition {
+                when: t.when.clone(),
+                publish: extract_publish_vars(&t.publish),
+                do_tasks: t.r#do.clone().unwrap_or_default(),
             })
-            .collect(),
-        };
+            .collect();

         let sub_tasks = if let Some(ref tasks) = task.tasks {
             let mut sub_nodes = Vec::new();
             for subtask in tasks {
-                sub_nodes.push(self.task_to_node(subtask)?);
+                sub_nodes.push(Self::task_to_node(subtask)?);
             }
             Some(sub_nodes)
         } else {
@@ -255,7 +357,6 @@ impl GraphBuilder {
             with_items: task.with_items.clone(),
             batch_size: task.batch_size,
             concurrency: task.concurrency,
-            publish,
             retry,
             timeout: task.timeout,
             transitions,
@@ -268,7 +369,6 @@ impl GraphBuilder {
     fn build(mut self) -> GraphResult<Self> {
         // Compute inbound edges from transitions
         self.compute_inbound_edges()?;

         Ok(self)
     }

@@ -276,17 +376,16 @@ impl GraphBuilder {
         let node_names: Vec<String> = self.nodes.keys().cloned().collect();

         for node_name in &node_names {
-            if let Some(node) = self.nodes.get(node_name) {
-                // Collect all tasks this task can transition to
-                let successors = vec![
-                    node.transitions.on_success.as_ref(),
-                    node.transitions.on_failure.as_ref(),
-                    node.transitions.on_complete.as_ref(),
-                    node.transitions.on_timeout.as_ref(),
-                ];
+            // Collect all successor task names from this node's transitions
+            let successors: Vec<String> = {
+                let node = self.nodes.get(node_name).unwrap();
+                node.transitions
+                    .iter()
+                    .flat_map(|t| t.do_tasks.iter().cloned())
+                    .collect()
+            };

-                // For each successor, record this task as an inbound edge
-                for successor in successors.into_iter().flatten() {
+            for successor in &successors {
                 if !self.nodes.contains_key(successor) {
                     return Err(GraphError::InvalidTaskReference(format!(
                         "Task '{}' references non-existent task '{}'",
@@ -296,25 +395,9 @@ impl GraphBuilder {

                 self.inbound_edges
                     .entry(successor.clone())
-                    .or_insert_with(HashSet::new)
+                    .or_default()
                     .insert(node_name.clone());
             }

-                // Add decision branch edges
-                for branch in &node.transitions.decision {
-                    if !self.nodes.contains_key(&branch.next) {
-                        return Err(GraphError::InvalidTaskReference(format!(
-                            "Task '{}' decision references non-existent task '{}'",
-                            node_name, branch.next
-                        )));
-                    }

-                    self.inbound_edges
-                        .entry(branch.next.clone())
-                        .or_insert_with(HashSet::new)
-                        .insert(node_name.clone());
-                }
-            }
         }

         // Update node inbound_tasks
@@ -350,7 +433,7 @@ impl From<GraphBuilder> for TaskGraph {
         for source in inbound {
             outbound_edges
                 .entry(source.clone())
-                .or_insert_with(HashSet::new)
+                .or_default()
                 .insert(task.clone());
         }
     }
@@ -364,24 +447,40 @@ impl From<GraphBuilder> for TaskGraph {
     }
 }

-/// Extract variable names from publish directives
-fn extract_publish_vars(publish: &[attune_common::workflow::PublishDirective]) -> Vec<String> {
+// ---------------------------------------------------------------------------
+// Publish variable extraction
+// ---------------------------------------------------------------------------
+
+/// Extract publish variable names and expressions from parser publish directives.
+fn extract_publish_vars(publish: &[attune_common::workflow::PublishDirective]) -> Vec<PublishVar> {
     use attune_common::workflow::PublishDirective;

     let mut vars = Vec::new();
     for directive in publish {
         match directive {
             PublishDirective::Simple(map) => {
-                vars.extend(map.keys().cloned());
+                for (key, value) in map {
+                    vars.push(PublishVar {
+                        name: key.clone(),
+                        expression: value.clone(),
+                    });
+                }
             }
             PublishDirective::Key(key) => {
-                vars.push(key.clone());
+                vars.push(PublishVar {
+                    name: key.clone(),
+                    expression: "{{ result() }}".to_string(),
+                });
             }
         }
     }
     vars
 }
+
+// ---------------------------------------------------------------------------
+// Tests
+// ---------------------------------------------------------------------------

 #[cfg(test)]
 mod tests {
     use super::*;
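The publish-directive flattening above can be exercised on its own. A self-contained sketch, assuming simplified local stand-ins for the parser's `PublishDirective` and the graph's `PublishVar` types (a `BTreeMap` replaces whatever map type the parser actually uses):

```rust
use std::collections::BTreeMap;

/// Simplified stand-in for the graph-side PublishVar in the diff.
#[derive(Debug, PartialEq)]
pub struct PublishVar {
    pub name: String,
    pub expression: String,
}

/// Simplified stand-in for the parser's publish directive.
pub enum PublishDirective {
    /// `- key: "{{ expr }}"` entries.
    Simple(BTreeMap<String, String>),
    /// Bare `- key` entries; these default to publishing `{{ result() }}`.
    Key(String),
}

/// Flatten directives into (name, expression) pairs, as in the diff above.
pub fn extract_publish_vars(publish: &[PublishDirective]) -> Vec<PublishVar> {
    let mut vars = Vec::new();
    for directive in publish {
        match directive {
            PublishDirective::Simple(map) => {
                for (key, value) in map {
                    vars.push(PublishVar {
                        name: key.clone(),
                        expression: value.clone(),
                    });
                }
            }
            PublishDirective::Key(key) => vars.push(PublishVar {
                name: key.clone(),
                expression: "{{ result() }}".to_string(),
            }),
        }
    }
    vars
}
```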
@@ -396,10 +495,16 @@ version: 1.0.0
 tasks:
   - name: task1
     action: core.echo
-    on_success: task2
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task2
   - name: task2
     action: core.echo
-    on_success: task3
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task3
   - name: task3
     action: core.echo
 "#;
@@ -422,7 +527,7 @@ tasks:
         assert_eq!(graph.inbound_edges["task3"].len(), 1);
         assert!(graph.inbound_edges["task3"].contains("task2"));

-        // Check transitions
+        // Check transitions via next_tasks
         let next = graph.next_tasks("task1", true);
         assert_eq!(next.len(), 1);
         assert_eq!(next[0], "task2");
@@ -433,40 +538,11 @@ tasks:
     }

     #[test]
-    fn test_parallel_entry_points() {
+    fn test_simple_sequential_graph_legacy() {
+        // Legacy format should still work (parser normalizes to `next`)
         let yaml = r#"
-ref: test.parallel_start
-label: Parallel Start
+ref: test.sequential_legacy
+label: Sequential Workflow (Legacy)
-version: 1.0.0
-tasks:
-  - name: task1
-    action: core.echo
-    on_success: final
-  - name: task2
-    action: core.echo
-    on_success: final
-  - name: final
-    action: core.complete
-"#;
-
-        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
-        let graph = TaskGraph::from_workflow(&workflow).unwrap();
-
-        assert_eq!(graph.entry_points.len(), 2);
-        assert!(graph.entry_points.contains(&"task1".to_string()));
-        assert!(graph.entry_points.contains(&"task2".to_string()));
-
-        // final task should have both as inbound edges
-        assert_eq!(graph.inbound_edges["final"].len(), 2);
-        assert!(graph.inbound_edges["final"].contains("task1"));
-        assert!(graph.inbound_edges["final"].contains("task2"));
-    }
-
-    #[test]
-    fn test_transitions() {
-        let yaml = r#"
-ref: test.transitions
-label: Transition Test
 version: 1.0.0
 tasks:
   - name: task1
@@ -482,18 +558,155 @@ tasks:
         let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
         let graph = TaskGraph::from_workflow(&workflow).unwrap();

-        // Test next_tasks follows transitions
+        assert_eq!(graph.nodes.len(), 3);
+        assert_eq!(graph.entry_points.len(), 1);
+
         let next = graph.next_tasks("task1", true);
         assert_eq!(next, vec!["task2"]);

         let next = graph.next_tasks("task2", true);
         assert_eq!(next, vec!["task3"]);
+    }

-        // task3 has no transitions
-        let next = graph.next_tasks("task3", true);
+    #[test]
+    fn test_parallel_entry_points() {
+        let yaml = r#"
+ref: test.parallel_start
+label: Parallel Start
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - final_task
+  - name: task2
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - final_task
+  - name: final_task
+    action: core.complete
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        assert_eq!(graph.entry_points.len(), 2);
+        assert!(graph.entry_points.contains(&"task1".to_string()));
+        assert!(graph.entry_points.contains(&"task2".to_string()));
+
+        // final_task should have both as inbound edges
+        assert_eq!(graph.inbound_edges["final_task"].len(), 2);
+        assert!(graph.inbound_edges["final_task"].contains("task1"));
+        assert!(graph.inbound_edges["final_task"].contains("task2"));
+    }
+
+    #[test]
+    fn test_transitions_success_and_failure() {
+        let yaml = r#"
+ref: test.transitions
+label: Transition Test
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task2
+      - when: "{{ failed() }}"
+        do:
+          - error_handler
+  - name: task2
+    action: core.echo
+  - name: error_handler
+    action: core.handle_error
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        // On success, should go to task2
+        let next = graph.next_tasks("task1", true);
+        assert_eq!(next, vec!["task2"]);
+
+        // On failure, should go to error_handler
+        let next = graph.next_tasks("task1", false);
+        assert_eq!(next, vec!["error_handler"]);
+
+        // task2 has no transitions
+        let next = graph.next_tasks("task2", true);
         assert!(next.is_empty());
     }
+
+    #[test]
+    fn test_multiple_do_targets() {
+        let yaml = r#"
+ref: test.multi_do
+label: Multi Do Targets
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        publish:
+          - msg: "task1 done"
+        do:
+          - log
+          - task2
+  - name: task2
+    action: core.echo
+  - name: log
+    action: core.log
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        let next = graph.next_tasks("task1", true);
+        assert_eq!(next.len(), 2);
+        assert!(next.contains(&"log".to_string()));
+        assert!(next.contains(&"task2".to_string()));
+
+        // Check publish vars
+        let transitions = graph.matching_transitions("task1", true);
+        assert_eq!(transitions.len(), 1);
+        assert_eq!(transitions[0].publish.len(), 1);
+        assert_eq!(transitions[0].publish[0].name, "msg");
+        assert_eq!(transitions[0].publish[0].expression, "task1 done");
+    }
+
+    #[test]
+    fn test_unconditional_transition() {
+        let yaml = r#"
+ref: test.unconditional
+label: Unconditional
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - do:
+          - task2
+  - name: task2
+    action: core.echo
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        // Unconditional fires on both success and failure
+        let next = graph.next_tasks("task1", true);
+        assert_eq!(next, vec!["task2"]);
+
+        let next = graph.next_tasks("task1", false);
+        assert_eq!(next, vec!["task2"]);
+    }

     #[test]
     fn test_cycle_support() {
         let yaml = r#"
@@ -503,8 +716,13 @@ version: 1.0.0
 tasks:
   - name: check
     action: core.check
-    on_success: process
-    on_failure: check
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - process
+      - when: "{{ failed() }}"
+        do:
+          - check
   - name: process
     action: core.process
 "#;
@@ -513,13 +731,12 @@ tasks:
         // Should not error on cycles
         let graph = TaskGraph::from_workflow(&workflow).unwrap();

-        // Note: check has a self-reference (check -> check on failure)
+        // check has a self-reference (check -> check on failure)
         // So it has an inbound edge and is not an entry point
         // process also has an inbound edge (check -> process on success)
-        // Therefore, there are no entry points in this workflow
         assert_eq!(graph.entry_points.len(), 0);

-        // check can transition to itself on failure (cycle)
+        // check transitions to itself on failure (cycle)
         let next = graph.next_tasks("check", false);
         assert_eq!(next, vec!["check"]);

@@ -537,18 +754,24 @@ version: 1.0.0
 tasks:
   - name: task1
     action: core.echo
-    on_success: final
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - final_task
   - name: task2
     action: core.echo
-    on_success: final
-  - name: final
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - final_task
+  - name: final_task
     action: core.complete
 "#;

         let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
         let graph = TaskGraph::from_workflow(&workflow).unwrap();

-        let inbound = graph.get_inbound_tasks("final");
+        let inbound = graph.get_inbound_tasks("final_task");
         assert_eq!(inbound.len(), 2);
         assert!(inbound.contains(&"task1".to_string()));
         assert!(inbound.contains(&"task2".to_string()));
@@ -556,4 +779,156 @@ tasks:
         let inbound = graph.get_inbound_tasks("task1");
         assert_eq!(inbound.len(), 0);
     }
+
+    #[test]
+    fn test_transition_kind_classification() {
+        let succeeded = GraphTransition {
+            when: Some("{{ succeeded() }}".to_string()),
+            publish: vec![],
+            do_tasks: vec!["t".to_string()],
+        };
+        assert_eq!(succeeded.kind(), TransitionKind::Succeeded);
+
+        let failed = GraphTransition {
+            when: Some("{{ failed() }}".to_string()),
+            publish: vec![],
+            do_tasks: vec!["t".to_string()],
+        };
+        assert_eq!(failed.kind(), TransitionKind::Failed);
+
+        let timed_out = GraphTransition {
+            when: Some("{{ timed_out() }}".to_string()),
+            publish: vec![],
+            do_tasks: vec!["t".to_string()],
+        };
+        assert_eq!(timed_out.kind(), TransitionKind::TimedOut);
+
+        let always = GraphTransition {
+            when: None,
+            publish: vec![],
+            do_tasks: vec!["t".to_string()],
+        };
+        assert_eq!(always.kind(), TransitionKind::Always);
+
+        let custom = GraphTransition {
+            when: Some("{{ result().status == 'ok' }}".to_string()),
+            publish: vec![],
+            do_tasks: vec!["t".to_string()],
+        };
+        assert_eq!(custom.kind(), TransitionKind::Custom);
+    }
+
+    #[test]
+    fn test_publish_extraction() {
+        let yaml = r#"
+ref: test.publish
+label: Publish Test
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        publish:
+          - result_val: "{{ result() }}"
+          - msg: "done"
+        do:
+          - task2
+  - name: task2
+    action: core.echo
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        let task1 = graph.get_task("task1").unwrap();
+        assert_eq!(task1.transitions.len(), 1);
+        assert_eq!(task1.transitions[0].publish.len(), 2);
+
+        // Note: HashMap ordering is not guaranteed, so just check both exist
+        let publish_names: Vec<&str> = task1.transitions[0]
+            .publish
+            .iter()
+            .map(|p| p.name.as_str())
+            .collect();
+        assert!(publish_names.contains(&"result_val"));
+        assert!(publish_names.contains(&"msg"));
+    }
+
+    #[test]
+    fn test_all_transition_targets() {
+        let yaml = r#"
+ref: test.all_targets
+label: All Targets Test
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - task2
+          - task3
+      - when: "{{ failed() }}"
+        do:
+          - error_handler
+  - name: task2
+    action: core.echo
+  - name: task3
+    action: core.echo
+  - name: error_handler
+    action: core.handle_error
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        let targets = graph.all_transition_targets("task1");
+        assert_eq!(targets.len(), 3);
+        assert!(targets.contains("task2"));
+        assert!(targets.contains("task3"));
+        assert!(targets.contains("error_handler"));
+    }
+
+    #[test]
+    fn test_mixed_success_failure_and_always() {
+        let yaml = r#"
+ref: test.mixed
+label: Mixed Transitions
+version: 1.0.0
+tasks:
+  - name: task1
+    action: core.echo
+    next:
+      - when: "{{ succeeded() }}"
+        do:
+          - success_task
+      - when: "{{ failed() }}"
+        do:
+          - failure_task
+      - do:
+          - always_task
+  - name: success_task
+    action: core.echo
+  - name: failure_task
+    action: core.echo
+  - name: always_task
+    action: core.echo
+"#;
+
+        let workflow = workflow::parse_workflow_yaml(yaml).unwrap();
+        let graph = TaskGraph::from_workflow(&workflow).unwrap();
+
+        // On success: succeeded + always fire
+        let next = graph.next_tasks("task1", true);
+        assert_eq!(next.len(), 2);
+        assert!(next.contains(&"success_task".to_string()));
+        assert!(next.contains(&"always_task".to_string()));
+
+        // On failure: failed + always fire
+        let next = graph.next_tasks("task1", false);
+        assert_eq!(next.len(), 2);
+        assert!(next.contains(&"failure_task".to_string()));
+        assert!(next.contains(&"always_task".to_string()));
+    }
 }
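The fan-out and deduplication behaviour these tests exercise, where several fired transitions contribute `do` targets and duplicates are collapsed, reduces to a small loop. A sketch under the assumption that each transition has already been evaluated to a `(fired, targets)` pair:

```rust
/// Collect targets from every transition that fired, preserving
/// first-seen order and skipping duplicates, as `next_tasks` does
/// in the graph code above.
pub fn fan_out(transitions: &[(bool, Vec<String>)]) -> Vec<String> {
    let mut next = Vec::new();
    for (fired, targets) in transitions {
        if *fired {
            for target in targets {
                if !next.contains(target) {
                    next.push(target.clone());
                }
            }
        }
    }
    next
}
```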
@@ -53,7 +53,7 @@ pub use coordinator::{
     WorkflowCoordinator, WorkflowExecutionHandle, WorkflowExecutionResult, WorkflowExecutionState,
     WorkflowExecutionStatus,
 };
-pub use graph::{GraphError, GraphResult, TaskGraph, TaskNode, TaskTransitions};
+pub use graph::{GraphError, GraphResult, GraphTransition, TaskGraph, TaskNode};
 pub use task_executor::{
     TaskExecutionError, TaskExecutionResult, TaskExecutionStatus, TaskExecutor,
 };

@@ -132,13 +132,25 @@ impl TaskExecutor {
         if let Some(ref output) = result.output {
             context.set_task_result(&task.name, output.clone());

-            // Publish variables
-            if !task.publish.is_empty() {
-                if let Err(e) = context.publish_from_result(output, &task.publish, None) {
+            // Publish variables from matching transitions
+            let success = matches!(result.status, TaskExecutionStatus::Success);
+            for transition in &task.transitions {
+                let should_fire = match transition.kind() {
+                    super::graph::TransitionKind::Succeeded => success,
+                    super::graph::TransitionKind::Failed => !success,
+                    super::graph::TransitionKind::TimedOut => !success,
+                    super::graph::TransitionKind::Always => true,
+                    super::graph::TransitionKind::Custom => true,
+                };
+                if should_fire && !transition.publish.is_empty() {
+                    let var_names: Vec<String> =
+                        transition.publish.iter().map(|p| p.name.clone()).collect();
+                    if let Err(e) = context.publish_from_result(output, &var_names, None) {
                         warn!("Failed to publish variables for task {}: {}", task.name, e);
                     }
                 }
             }
+        }

         Ok(TaskExecutionResult {
             duration_ms,
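The per-transition publishing step above boils down to: classify each transition, and only collect variable names from the ones that fire for the task's outcome. A standalone sketch with a simplified `Kind` enum standing in for `TransitionKind`, and `(Kind, names)` pairs standing in for real transitions:

```rust
/// Simplified stand-in for the graph's TransitionKind.
#[derive(Clone, Copy)]
pub enum Kind {
    Succeeded,
    Failed,
    TimedOut,
    Always,
    Custom,
}

/// Given the task outcome, collect the publish variable names from
/// transitions that fire, mirroring the `var_names` construction above.
pub fn publish_names(success: bool, transitions: &[(Kind, Vec<String>)]) -> Vec<String> {
    let mut names = Vec::new();
    for (kind, publish) in transitions {
        let should_fire = match kind {
            Kind::Succeeded => success,
            Kind::Failed | Kind::TimedOut => !success,
            Kind::Always | Kind::Custom => true,
        };
        if should_fire && !publish.is_empty() {
            names.extend(publish.iter().cloned());
        }
    }
    names
}
```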
@@ -165,7 +165,17 @@ impl ActionExecutor {
             }
         }

-        // Otherwise, parse action_ref and query by pack.ref + action.ref
+        // Fallback: look up by the full qualified action ref directly
+        let action = sqlx::query_as::<_, Action>("SELECT * FROM action WHERE ref = $1")
+            .bind(&execution.action_ref)
+            .fetch_optional(&self.pool)
+            .await?;
+
+        if let Some(action) = action {
+            return Ok(action);
+        }
+
+        // Final fallback: parse action_ref as "pack.action" and query by pack ref
         let parts: Vec<&str> = execution.action_ref.split('.').collect();
         if parts.len() != 2 {
             return Err(Error::validation(format!(
@@ -175,9 +185,8 @@ impl ActionExecutor {
         }

         let pack_ref = parts[0];
-        let action_ref = parts[1];

-        // Query action by pack ref and action ref
+        // Query action by pack ref and full action ref
         let action = sqlx::query_as::<_, Action>(
             r#"
             SELECT a.*
@@ -187,7 +196,7 @@ impl ActionExecutor {
             "#,
         )
         .bind(pack_ref)
-        .bind(action_ref)
+        .bind(&execution.action_ref)
         .fetch_optional(&self.pool)
         .await?
         .ok_or_else(|| Error::not_found("Action", "ref", execution.action_ref.clone()))?;
@@ -368,9 +377,40 @@ impl ActionExecutor {
         if action_file_path.exists() {
             Some(action_file_path)
         } else {
+            // Detailed diagnostics to help track down missing action files
+            let pack_dir_exists = pack_dir.exists();
+            let actions_dir = pack_dir.join("actions");
+            let actions_dir_exists = actions_dir.exists();
+            let actions_dir_contents: Vec<String> = if actions_dir_exists {
+                std::fs::read_dir(&actions_dir)
+                    .map(|entries| {
+                        entries
+                            .filter_map(|e| e.ok())
+                            .map(|e| e.file_name().to_string_lossy().to_string())
+                            .collect()
+                    })
+                    .unwrap_or_default()
+            } else {
+                vec![]
+            };
+
             warn!(
-                "Action file not found at {:?} for action {}",
-                action_file_path, action.r#ref
+                "Action file not found for action '{}': \
+                 expected_path={}, \
+                 packs_base_dir={}, \
+                 pack_ref={}, \
+                 entrypoint={}, \
+                 pack_dir_exists={}, \
+                 actions_dir_exists={}, \
+                 actions_dir_contents={:?}",
+                action.r#ref,
+                action_file_path.display(),
+                self.packs_base_dir.display(),
+                action.pack_ref,
+                entry_point,
+                pack_dir_exists,
+                actions_dir_exists,
+                actions_dir_contents,
             );
             None
         }
@@ -567,9 +607,7 @@ impl ActionExecutor {

         warn!(
             "Execution {} failed without ExecutionResult - {}: {}",
-            execution_id,
-            "early/catastrophic failure",
-            err_msg
+            execution_id, "early/catastrophic failure", err_msg
         );

         // Check if stderr log exists and is non-empty from artifact storage
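The diagnostics added above depend on a directory listing that never fails: a missing or unreadable directory degrades to an empty list instead of an error. A sketch of that defensive pattern (the function name is illustrative):

```rust
use std::path::Path;

/// List file names in `dir`, returning an empty Vec when the directory
/// is missing or unreadable and skipping entries that fail to read,
/// as the missing-action-file diagnostics above do.
pub fn list_dir_names(dir: &Path) -> Vec<String> {
    std::fs::read_dir(dir)
        .map(|entries| {
            entries
                .filter_map(|e| e.ok())
                .map(|e| e.file_name().to_string_lossy().to_string())
                .collect()
        })
        .unwrap_or_default()
}
```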
@@ -188,7 +188,7 @@ Authorization: Bearer {sensor_token}
 Content-Type: application/json

 {
-  "trigger_type": "core.timer",
+  "trigger_ref": "core.timer",
   "payload": {
     "timestamp": "2025-01-27T12:34:56Z",
     "scheduled_time": "2025-01-27T12:34:56Z"
@@ -197,6 +197,8 @@ Content-Type: application/json
 }
 ```
+
+> **Note**: `trigger_type` is accepted as an alias for `trigger_ref` for backward compatibility, but `trigger_ref` is the canonical field name.

 **Important**: Sensors can only emit events for trigger types declared in their token's `metadata.trigger_types`. The API will reject event creation requests for unauthorized trigger types with a `403 Forbidden` error.

 ### Event Payload Guidelines
|||||||
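As a quick illustration of the aliasing rule in the note above, a sensor client could normalize an event body before posting it. This helper is hypothetical (not part of the codebase); only the `trigger_ref`/`trigger_type` field names come from the API doc:

```python
# Hypothetical client-side helper: emit only the canonical `trigger_ref`
# key, accepting the legacy `trigger_type` alias on input.
def canonicalize_event(body: dict) -> dict:
    """Return a copy of the event body using the canonical `trigger_ref` key."""
    event = dict(body)
    if "trigger_ref" not in event and "trigger_type" in event:
        # Fold the backward-compat alias into the canonical field.
        event["trigger_ref"] = event.pop("trigger_type")
    return event

legacy = {"trigger_type": "core.timer", "payload": {"timestamp": "2025-01-27T12:34:56Z"}}
print(canonicalize_event(legacy))
```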
Submodule packs.external/nodejs_example added at 62c42b3996
Submodule packs.external/python_example added at 57532efabd
@@ -15,13 +15,12 @@ parameter_format: dotenv
 # Output format: json (structured data parsing enabled)
 output_format: json

-# Action parameters schema
+# Action parameters schema (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     pack_paths:
       type: array
       description: "List of pack directory paths to build environments for"
+      required: true
       items:
         type: string
       minItems: 1
@@ -55,14 +54,10 @@ parameters:
       default: 600
       minimum: 60
       maximum: 3600
-  required:
-    - pack_paths

 # Output schema: describes the JSON structure written to stdout
 # Note: stdout/stderr/exit_code are captured automatically by the execution system
 output_schema:
-  type: object
-  properties:
     built_environments:
       type: array
       description: "List of successfully built environments"
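The metadata hunks in this commit all apply the same migration: a nested JSON-Schema `parameters` block (`type: object` / `properties:` / trailing `required:` list) becomes a StackStorm-style flat map with inline `required: true`. A sketch of that transformation, written for illustration only (the converter itself is not part of the repo):

```python
# Illustrative converter: nested JSON-Schema parameters -> StackStorm-style
# flat parameters, folding the trailing `required` list into inline flags.
def to_stackstorm_params(schema: dict) -> dict:
    props = schema.get("properties", {})
    required = set(schema.get("required", []))
    flat = {}
    for name, spec in props.items():
        spec = dict(spec)  # copy so the input schema is left untouched
        if name in required:
            spec["required"] = True
        flat[name] = spec
    return flat

json_schema = {
    "type": "object",
    "properties": {
        "pack_paths": {"type": "array", "items": {"type": "string"}},
        "timeout": {"type": "integer", "default": 600},
    },
    "required": ["pack_paths"],
}
print(to_stackstorm_params(json_schema))
```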
@@ -15,19 +15,19 @@ parameter_format: dotenv
 # Output format: json (structured data parsing enabled)
 output_format: json

-# Action parameters schema
+# Action parameters schema (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     packs:
       type: array
       description: "List of packs to download (git URLs, HTTP URLs, or pack refs)"
       items:
         type: string
       minItems: 1
+      required: true
     destination_dir:
       type: string
       description: "Destination directory for downloaded packs"
+      required: true
     registry_url:
       type: string
       description: "Pack registry URL for resolving pack refs (optional)"
@@ -49,15 +49,10 @@ parameters:
       type: string
       description: "Attune API URL for making registry lookups"
       default: "http://localhost:8080"
-  required:
-    - packs
-    - destination_dir

 # Output schema: describes the JSON structure written to stdout
 # Note: stdout/stderr/exit_code are captured automatically by the execution system
 output_schema:
-  type: object
-  properties:
     downloaded_packs:
       type: array
       description: "List of successfully downloaded packs"
@@ -19,10 +19,8 @@ parameter_format: dotenv
 # Output format: text (no structured data parsing)
 output_format: text

-# Action parameters schema (standard JSON Schema format)
+# Action parameters schema (StackStorm-style: inline required/secret per parameter)
 parameters:
-  type: object
-  properties:
     message:
       type: string
       description: "Message to echo (empty string if not provided)"
@@ -15,16 +15,15 @@ parameter_format: dotenv
 # Output format: json (structured data parsing enabled)
 output_format: json

-# Action parameters schema
+# Action parameters schema (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     pack_paths:
       type: array
       description: "List of pack directory paths to analyze"
       items:
         type: string
       minItems: 1
+      required: true
     skip_validation:
       type: boolean
       description: "Skip validation of pack.yaml schema"
@@ -33,14 +32,10 @@ parameters:
       type: string
       description: "Attune API URL for checking installed packs"
       default: "http://localhost:8080"
-  required:
-    - pack_paths

 # Output schema: describes the JSON structure written to stdout
 # Note: stdout/stderr/exit_code are captured automatically by the execution system
 output_schema:
-  type: object
-  properties:
     dependencies:
       type: array
       description: "List of pack dependencies that need to be installed"
@@ -20,13 +20,12 @@ parameter_format: dotenv
 # Output format: json (structured data parsing enabled)
 output_format: json

-# Action parameters schema (standard JSON Schema format)
+# Action parameters schema (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     url:
       type: string
       description: "URL to send the request to"
+      required: true
     method:
       type: string
       description: "HTTP method to use"
@@ -89,14 +88,10 @@ parameters:
       type: integer
       description: "Maximum number of redirects to follow"
       default: 10
-  required:
-    - url

 # Output schema: describes the JSON structure written to stdout
 # Note: stdout/stderr/exit_code are captured automatically by the execution system
 output_schema:
-  type: object
-  properties:
     status_code:
       type: integer
       description: "HTTP status code"
@@ -19,10 +19,8 @@ parameter_format: dotenv
 # Output format: text (no structured data parsing)
 output_format: text

-# Action parameters schema (standard JSON Schema format)
+# Action parameters schema (StackStorm-style inline format)
 parameters:
-  type: object
-  properties:
     message:
       type: string
       description: "Optional message to log (for debugging)"
@@ -15,16 +15,15 @@ parameter_format: dotenv
 # Output format: json (structured data parsing enabled)
 output_format: json

-# Action parameters schema
+# Action parameters schema (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     pack_paths:
       type: array
       description: "List of pack directory paths to register"
       items:
         type: string
       minItems: 1
+      required: true
     packs_base_dir:
       type: string
       description: "Base directory where packs are permanently stored"
@@ -49,14 +48,10 @@ parameters:
       type: string
       description: "API authentication token"
       secret: true
-  required:
-    - pack_paths

 # Output schema: describes the JSON structure written to stdout
 # Note: stdout/stderr/exit_code are captured automatically by the execution system
 output_schema:
-  type: object
-  properties:
     registered_packs:
       type: array
       description: "List of successfully registered packs"
@@ -19,21 +19,18 @@ parameter_format: dotenv
 # Output format: text (no structured data parsing)
 output_format: text

-# Action parameters schema (standard JSON Schema format)
+# Action parameters (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     seconds:
       type: integer
       description: "Number of seconds to sleep"
+      required: true
       default: 1
       minimum: 0
       maximum: 3600
     message:
       type: string
       description: "Optional message to display before sleeping"
-  required:
-    - seconds

 # Output schema: not applicable for text output format
 # The action outputs plain text to stdout
@@ -11,10 +11,8 @@ email: "core@attune.io"
 # Pack is a system pack (shipped with Attune)
 system: true

-# Pack configuration schema (minimal for core pack)
+# Pack configuration schema (StackStorm-style flat format)
 conf_schema:
-  type: object
-  properties:
     max_action_timeout:
       type: integer
       description: "Maximum timeout for action execution in seconds"
@@ -18,10 +18,8 @@ trigger_types:
   - core.crontimer
   - core.datetimetimer

-# Sensor configuration schema (standard JSON Schema format)
+# Sensor configuration schema (StackStorm-style flat format)
 parameters:
-  type: object
-  properties:
     check_interval_seconds:
       type: integer
       description: "How often to check if triggers should fire (in seconds)"
@@ -9,13 +9,12 @@ enabled: true
 # Trigger type
 type: cron

-# Parameter schema - configuration for the trigger instance (standard JSON Schema format)
+# Parameter schema - configuration for the trigger instance (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     expression:
       type: string
       description: "Cron expression in standard format (second minute hour day month weekday)"
+      required: true
     timezone:
       type: string
       description: "Timezone for cron schedule (e.g., 'UTC', 'America/New_York')"
@@ -23,28 +22,28 @@ parameters:
     description:
       type: string
       description: "Human-readable description of the schedule"
-  required:
-    - expression

 # Payload schema - data emitted when trigger fires
 output:
-  type: object
-  properties:
     type:
       type: string
       const: cron
       description: "Trigger type identifier"
+      required: true
     fired_at:
       type: string
       format: date-time
       description: "Timestamp when the trigger fired"
+      required: true
     scheduled_at:
       type: string
       format: date-time
       description: "Timestamp when the trigger was scheduled to fire"
+      required: true
     expression:
       type: string
       description: "The cron expression that triggered this event"
+      required: true
     timezone:
       type: string
       description: "Timezone used for scheduling"
@@ -58,11 +57,6 @@ output:
     sensor_ref:
       type: string
       description: "Reference to the sensor that generated this event"
-  required:
-    - type
-    - fired_at
-    - scheduled_at
-    - expression

 # Tags for categorization
 tags:
@@ -9,13 +9,12 @@ enabled: true
 # Trigger type
 type: one_shot

-# Parameter schema - configuration for the trigger instance (standard JSON Schema format)
+# Parameter schema - configuration for the trigger instance (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     fire_at:
       type: string
       description: "ISO 8601 timestamp when the timer should fire (e.g., '2024-12-31T23:59:59Z')"
+      required: true
     timezone:
       type: string
       description: "Timezone for the datetime (e.g., 'UTC', 'America/New_York')"
@@ -23,25 +22,24 @@ parameters:
     description:
       type: string
       description: "Human-readable description of when this timer fires"
-  required:
-    - fire_at

 # Payload schema - data emitted when trigger fires
 output:
-  type: object
-  properties:
     type:
       type: string
       const: one_shot
       description: "Trigger type identifier"
+      required: true
     fire_at:
       type: string
       format: date-time
       description: "Scheduled fire time"
+      required: true
     fired_at:
       type: string
       format: date-time
       description: "Actual fire time"
+      required: true
     timezone:
       type: string
       description: "Timezone used for scheduling"
@@ -51,10 +49,6 @@ output:
     sensor_ref:
       type: string
       description: "Reference to the sensor that generated this event"
-  required:
-    - type
-    - fire_at
-    - fired_at

 # Tags for categorization
 tags:
@@ -9,10 +9,8 @@ enabled: true
 # Trigger type
 type: interval

-# Parameter schema - configuration for the trigger instance (standard JSON Schema format)
+# Parameter schema - configuration for the trigger instance (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     unit:
       type: string
       enum:
@@ -21,39 +19,35 @@ parameters:
         - hours
       description: "Time unit for the interval"
       default: "seconds"
+      required: true
     interval:
       type: integer
       description: "Number of time units between each trigger"
       default: 60
-  required:
-    - unit
-    - interval
+      required: true

 # Payload schema - data emitted when trigger fires
 output:
-  type: object
-  properties:
     type:
       type: string
       const: interval
       description: "Trigger type identifier"
+      required: true
     interval_seconds:
       type: integer
       description: "Total interval in seconds"
+      required: true
     fired_at:
       type: string
       format: date-time
       description: "Timestamp when the trigger fired"
+      required: true
     execution_count:
       type: integer
       description: "Number of times this trigger has fired"
     sensor_ref:
       type: string
       description: "Reference to the sensor that generated this event"
-  required:
-    - type
-    - interval_seconds
-    - fired_at

 # Tags for categorization
 tags:
@@ -7,16 +7,15 @@ label: "Install Packs"
 description: "Install one or more packs from git repositories, HTTP archives, or pack registry with automatic dependency resolution"
 version: "1.0.0"

-# Input parameters
+# Input parameters (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     packs:
       type: array
       description: "List of packs to install (git URLs, HTTP URLs, or pack refs like 'slack@1.0.0')"
       items:
         type: string
       minItems: 1
+      required: true
     ref_spec:
       type: string
       description: "Git reference to checkout for git URLs (branch, tag, or commit)"
@@ -54,8 +53,6 @@ parameters:
       default: 1800
       minimum: 300
       maximum: 7200
-  required:
-    - packs

 # Workflow variables
 vars:
@@ -218,8 +215,6 @@ tasks:

 # Output schema
 output_schema:
-  type: object
-  properties:
     registered_packs:
       type: array
       description: "Successfully registered packs"
@@ -19,10 +19,8 @@ parameter_format: json
 # Output format: jsonl (each line is a JSON object, collected into array)
 output_format: jsonl

-# Action parameters schema (standard JSON Schema format)
+# Action parameters schema (StackStorm-style with inline required/secret)
 parameters:
-  type: object
-  properties:
     count:
       type: integer
       description: "Number of items to generate"
@@ -33,22 +31,17 @@ parameters:
 # Output schema: array of objects (required for jsonl format)
 # Each line in stdout will be parsed as JSON and collected into this array
 output_schema:
-  type: array
-  items:
-    type: object
-    properties:
     id:
       type: integer
       description: "Item identifier"
+      required: true
     value:
       type: string
       description: "Item value"
+      required: true
     timestamp:
       type: string
       description: "ISO 8601 timestamp"
-    required:
-      - id
-      - value

 # Tags for categorization
 tags:
@@ -14,8 +14,6 @@ enabled: true

 # Configuration schema
 conf_schema:
-  type: object
-  properties:
     example_setting:
       type: string
       description: "Example configuration setting"
@@ -16,6 +16,9 @@ const PackRegisterPage = lazy(() => import("@/pages/packs/PackRegisterPage"));
 const PackInstallPage = lazy(() => import("@/pages/packs/PackInstallPage"));
 const PackEditPage = lazy(() => import("@/pages/packs/PackEditPage"));
 const ActionsPage = lazy(() => import("@/pages/actions/ActionsPage"));
+const WorkflowBuilderPage = lazy(
+  () => import("@/pages/actions/WorkflowBuilderPage"),
+);
 const RulesPage = lazy(() => import("@/pages/rules/RulesPage"));
 const RuleCreatePage = lazy(() => import("@/pages/rules/RuleCreatePage"));
 const RuleEditPage = lazy(() => import("@/pages/rules/RuleEditPage"));
@@ -78,6 +81,14 @@ function App() {
             <Route path="packs/:ref" element={<PacksPage />} />
             <Route path="packs/:ref/edit" element={<PackEditPage />} />
             <Route path="actions" element={<ActionsPage />} />
+            <Route
+              path="actions/workflows/new"
+              element={<WorkflowBuilderPage />}
+            />
+            <Route
+              path="actions/workflows/:ref/edit"
+              element={<WorkflowBuilderPage />}
+            />
             <Route path="actions/:ref" element={<ActionsPage />} />
             <Route path="rules" element={<RulesPage />} />
             <Route path="rules/new" element={<RuleCreatePage />} />
@@ -6,10 +6,6 @@
  * Request DTO for installing a pack from remote source
  */
 export type InstallPackRequest = {
-  /**
-   * Force reinstall if pack already exists
-   */
-  force?: boolean;
   /**
    * Git branch, tag, or commit reference
    */
@@ -27,4 +23,3 @@ export type InstallPackRequest = {
   */
  source: string;
 };
-
@@ -4,6 +4,7 @@ import { OpenAPI } from "@/api";
|
|||||||
import { Play, X } from "lucide-react";
|
import { Play, X } from "lucide-react";
|
||||||
import ParamSchemaForm, {
|
import ParamSchemaForm, {
|
||||||
validateParamSchema,
|
validateParamSchema,
|
||||||
|
extractProperties,
|
||||||
type ParamSchema,
|
type ParamSchema,
|
||||||
} from "@/components/common/ParamSchemaForm";
|
} from "@/components/common/ParamSchemaForm";
|
||||||
|
|
||||||
@@ -28,11 +29,11 @@ export default function ExecuteActionModal({
|
|||||||
const queryClient = useQueryClient();
|
   const queryClient = useQueryClient();

   const paramSchema: ParamSchema = (action.param_schema as ParamSchema) || {};
+  const paramProperties = extractProperties(paramSchema);

   // If initialParameters are provided, use them (stripping out any keys not in the schema)
   const buildInitialValues = (): Record<string, any> => {
     if (!initialParameters) return {};
-    const properties = paramSchema.properties || {};
     const values: Record<string, any> = {};
     // Include all initial parameters - even those not in the schema
     // so users can see exactly what was run before
@@ -42,7 +43,7 @@ export default function ExecuteActionModal({
       }
     }
     // Also fill in defaults for any schema properties not covered
-    for (const [key, param] of Object.entries(properties)) {
+    for (const [key, param] of Object.entries(paramProperties)) {
       if (values[key] === undefined && param?.default !== undefined) {
         values[key] = param.default;
       }
@@ -50,9 +51,8 @@ export default function ExecuteActionModal({
     return values;
   };

-  const [parameters, setParameters] = useState<Record<string, any>>(
-    buildInitialValues,
-  );
+  const [parameters, setParameters] =
+    useState<Record<string, any>>(buildInitialValues);
   const [paramErrors, setParamErrors] = useState<Record<string, string>>({});
   const [envVars, setEnvVars] = useState<Array<{ key: string; value: string }>>(
     [{ key: "", value: "" }],
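The `buildInitialValues` rewrite keeps every previously-run parameter (even ones the schema no longer declares) and then back-fills schema defaults. A framework-free sketch of that merge; the `ParamDef` shape and the parameter names here are illustrative rather than the component's exact types:

```typescript
// Flat StackStorm-style schema: every top-level key is a parameter definition.
interface ParamDef {
  type?: string;
  default?: any;
  required?: boolean;
}

function buildInitialValues(
  schema: Record<string, ParamDef>,
  initialParameters?: Record<string, any>,
): Record<string, any> {
  if (!initialParameters) return {};
  // Keep every previously-run value, even ones the schema no longer declares,
  // so users can see exactly what was run before.
  const values: Record<string, any> = { ...initialParameters };
  // Back-fill schema defaults for parameters the previous run did not set.
  for (const [key, param] of Object.entries(schema)) {
    if (values[key] === undefined && param?.default !== undefined) {
      values[key] = param.default;
    }
  }
  return values;
}
```

Passing the function itself to `useState` (as the diff does) makes React treat it as a lazy initializer, so this merge runs only on the first render.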
@@ -1,29 +1,12 @@
 /**
  * ParamSchemaDisplay - Read-only display component for parameters
- * Shows parameter values in a human-friendly format based on their schema
- * Supports standard JSON Schema format (https://json-schema.org/draft/2020-12/schema)
+ * Shows parameter values in a human-friendly format based on their schema.
+ * Expects StackStorm-style flat parameter format with inline required/secret.
  */

-/**
- * Standard JSON Schema format for parameters
- */
-export interface ParamSchema {
-  type?: "object";
-  properties?: {
-    [key: string]: {
-      type?: "string" | "number" | "integer" | "boolean" | "array" | "object";
-      description?: string;
-      default?: any;
-      enum?: string[];
-      minimum?: number;
-      maximum?: number;
-      minLength?: number;
-      maxLength?: number;
-      secret?: boolean;
-    };
-  };
-  required?: string[];
-}
+import type { ParamSchema } from "./ParamSchemaForm";
+export type { ParamSchema };
+import { extractProperties } from "./ParamSchemaForm";

 interface ParamSchemaDisplayProps {
   schema: ParamSchema;
@@ -41,8 +24,7 @@ export default function ParamSchemaDisplay({
   className = "",
   emptyMessage = "No parameters configured",
 }: ParamSchemaDisplayProps) {
-  const properties = schema.properties || {};
-  const requiredFields = schema.required || [];
+  const properties = extractProperties(schema);
   const paramEntries = Object.entries(properties);

   // Filter to only show parameters that have values
@@ -63,7 +45,7 @@ export default function ParamSchemaDisplay({
    * Check if a field is required
    */
   const isRequired = (key: string): boolean => {
-    return requiredFields.includes(key);
+    return !!properties[key]?.required;
   };

   /**
@@ -320,7 +302,7 @@ export function ParamSchemaDisplayCompact({
   values,
   className = "",
 }: ParamSchemaDisplayProps) {
-  const properties = schema.properties || {};
+  const properties = extractProperties(schema);
   const paramEntries = Object.entries(properties);
   const populatedParams = paramEntries.filter(([key]) => {
     const value = values[key];
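With `required` inlined on each parameter, `isRequired` becomes a per-entry lookup instead of a scan of a top-level `required` array. A minimal before/after sketch with invented parameter names:

```typescript
// Old JSON-Schema shape: required parameter names collected in a top-level array.
const oldSchema = {
  properties: { host: { type: "string" }, port: { type: "integer" } },
  required: ["host"],
};
const isRequiredOld = (key: string): boolean =>
  (oldSchema.required || []).includes(key);

// New flat shape: required is a boolean on the parameter definition itself.
const newSchema: Record<string, { type?: string; required?: boolean }> = {
  host: { type: "string", required: true },
  port: { type: "integer" },
};
const isRequiredNew = (key: string): boolean => !!newSchema[key]?.required;
```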
@@ -1,13 +1,17 @@
 import { useState, useEffect } from "react";

 /**
- * Standard JSON Schema format for parameters
- * Follows https://json-schema.org/draft/2020-12/schema
+ * StackStorm-style parameter schema format.
+ * Parameters are defined as a flat map of parameter name to definition,
+ * with `required` and `secret` inlined per-parameter.
+ *
+ * Example:
+ * {
+ *   "url": { "type": "string", "description": "Target URL", "required": true },
+ *   "token": { "type": "string", "secret": true }
+ * }
  */
-export interface ParamSchema {
-  type?: "object";
-  properties?: {
-    [key: string]: {
+export interface ParamSchemaProperty {
   type?: "string" | "number" | "integer" | "boolean" | "array" | "object";
   description?: string;
   default?: any;
@@ -17,14 +21,39 @@ export interface ParamSchema {
   minLength?: number;
   maxLength?: number;
   secret?: boolean;
-    };
-  };
-  required?: string[];
+  required?: boolean;
+  position?: number;
+  items?: any;
+}
+
+export interface ParamSchema {
+  [key: string]: ParamSchemaProperty;
 }

 /**
  * Props for ParamSchemaForm component
  */
+/**
+ * Extract the parameter properties from a flat parameter schema.
+ *
+ * All schemas (param_schema, out_schema, conf_schema) use the same flat format:
+ * { param_name: { type, description, required, secret, ... }, ... }
+ */
+export function extractProperties(
+  schema: ParamSchema | any,
+): Record<string, ParamSchemaProperty> {
+  if (!schema || typeof schema !== "object") return {};
+  // StackStorm-style flat format: { param_name: { type, description, required, ... }, ... }
+  // Filter out entries that don't look like parameter definitions (e.g., stray "type" or "required" keys)
+  const props: Record<string, ParamSchemaProperty> = {};
+  for (const [key, value] of Object.entries(schema)) {
+    if (value && typeof value === "object" && !Array.isArray(value)) {
+      props[key] = value as ParamSchemaProperty;
+    }
+  }
+  return props;
+}
+
 interface ParamSchemaFormProps {
   schema: ParamSchema;
   values: Record<string, any>;
@@ -117,8 +146,7 @@ export default function ParamSchemaForm({
   // Merge external and local errors
   const allErrors = { ...localErrors, ...errors };

-  const properties = schema.properties || {};
-  const requiredFields = schema.required || [];
+  const properties = extractProperties(schema);

   // Initialize values with defaults from schema
   useEffect(() => {
@@ -159,7 +187,7 @@ export default function ParamSchemaForm({
    * Check if a field is required
    */
   const isRequired = (key: string): boolean => {
-    return requiredFields.includes(key);
+    return !!properties[key]?.required;
   };

   /**
@@ -506,15 +534,16 @@ export function validateParamSchema(
   allowTemplates: boolean = false,
 ): Record<string, string> {
   const errors: Record<string, string> = {};
-  const properties = schema.properties || {};
-  const requiredFields = schema.required || [];
+  const properties = extractProperties(schema);

-  // Check required fields
-  requiredFields.forEach((key) => {
+  // Check required fields (inline per-parameter)
+  Object.entries(properties).forEach(([key, param]) => {
+    if (param?.required) {
       const value = values[key];
       if (value === undefined || value === null || value === "") {
         errors[key] = "This field is required";
       }
+    }
   });

   // Type-specific validation
@@ -524,7 +553,7 @@ export function validateParamSchema(
     // Skip if no value and not required
     if (
       (value === undefined || value === null || value === "") &&
-      !requiredFields.includes(key)
+      !param?.required
     ) {
       return;
     }
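`extractProperties` and the rewritten required check in `validateParamSchema` can be exercised outside React. This sketch reproduces the diff's logic in standalone form; `missingRequired` is a simplified stand-in for the full validator, and the schema contents are invented:

```typescript
interface ParamSchemaProperty {
  type?: string;
  description?: string;
  default?: any;
  required?: boolean;
  secret?: boolean;
}

// Flat format: { param_name: { type, description, required, secret, ... }, ... }
// Entries whose value is not an object (e.g. a stray "type" key) are skipped.
function extractProperties(schema: any): Record<string, ParamSchemaProperty> {
  if (!schema || typeof schema !== "object") return {};
  const props: Record<string, ParamSchemaProperty> = {};
  for (const [key, value] of Object.entries(schema)) {
    if (value && typeof value === "object" && !Array.isArray(value)) {
      props[key] = value as ParamSchemaProperty;
    }
  }
  return props;
}

// Required check now reads the inline flag instead of a top-level array.
function missingRequired(
  schema: any,
  values: Record<string, any>,
): string[] {
  const missing: string[] = [];
  for (const [key, param] of Object.entries(extractProperties(schema))) {
    const value = values[key];
    if (param.required && (value === undefined || value === null || value === "")) {
      missing.push(key);
    }
  }
  return missing;
}
```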
@@ -6,6 +6,7 @@ interface SchemaProperty {
   type: string;
   description: string;
   required: boolean;
+  secret: boolean;
   default?: string;
   minimum?: number;
   maximum?: number;
@@ -52,18 +53,19 @@ export default function SchemaBuilder({
   );

   // Initialize properties from schema value
+  // Expects StackStorm-style flat format: { param_name: { type, required, secret, ... }, ... }
   useEffect(() => {
-    if (value && value.properties) {
-      const props: SchemaProperty[] = [];
-      const requiredFields = value.required || [];
+    if (!value || typeof value !== "object") return;
+    const props: SchemaProperty[] = [];

-      Object.entries(value.properties).forEach(
-        ([name, propDef]: [string, any]) => {
+    Object.entries(value).forEach(([name, propDef]: [string, any]) => {
+      if (propDef && typeof propDef === "object" && !Array.isArray(propDef)) {
         props.push({
           name,
           type: propDef.type || "string",
           description: propDef.description || "",
-          required: requiredFields.includes(name),
+          required: propDef.required === true,
+          secret: propDef.secret === true,
           default:
             propDef.default !== undefined
               ? JSON.stringify(propDef.default)
@@ -75,9 +77,10 @@ export default function SchemaBuilder({
           pattern: propDef.pattern,
           enum: propDef.enum,
         });
-        },
-      );
+      }
+    });

+    if (props.length > 0) {
       setProperties(props);
     }
   }, []);
@@ -90,20 +93,13 @@ export default function SchemaBuilder({
     }
   }, [showRawJson]);

+  // Build StackStorm-style flat parameter schema
   const buildSchema = (): Record<string, any> => {
     if (properties.length === 0) {
-      return {
-        type: "object",
-        properties: {},
-        required: [],
-      };
+      return {};
     }

-    const schema: Record<string, any> = {
-      type: "object",
-      properties: {},
-      required: [] as string[],
-    };
+    const schema: Record<string, any> = {};

     properties.forEach((prop) => {
       const propSchema: Record<string, any> = {
@@ -114,6 +110,14 @@ export default function SchemaBuilder({
         propSchema.description = prop.description;
       }

+      if (prop.required) {
+        propSchema.required = true;
+      }
+
+      if (prop.secret) {
+        propSchema.secret = true;
+      }
+
       if (prop.default !== undefined && prop.default !== "") {
         try {
           propSchema.default = JSON.parse(prop.default);
@@ -135,11 +139,7 @@ export default function SchemaBuilder({
         if (prop.maximum !== undefined) propSchema.maximum = prop.maximum;
       }

-      schema.properties[prop.name] = propSchema;
-
-      if (prop.required) {
-        schema.required.push(prop.name);
-      }
+      schema[prop.name] = propSchema;
     });

     return schema;
@@ -151,22 +151,15 @@ export default function SchemaBuilder({
     onChange(schema);
   };

+  // Build StackStorm-style flat parameter schema from properties array
   const buildSchemaFromProperties = (
     props: SchemaProperty[],
   ): Record<string, any> => {
     if (props.length === 0) {
-      return {
-        type: "object",
-        properties: {},
-        required: [],
-      };
+      return {};
     }

-    const schema: Record<string, any> = {
-      type: "object",
-      properties: {},
-      required: [] as string[],
-    };
+    const schema: Record<string, any> = {};

     props.forEach((prop) => {
       const propSchema: Record<string, any> = {
@@ -177,6 +170,14 @@ export default function SchemaBuilder({
         propSchema.description = prop.description;
       }

+      if (prop.required) {
+        propSchema.required = true;
+      }
+
+      if (prop.secret) {
+        propSchema.secret = true;
+      }
+
       if (prop.default !== undefined && prop.default !== "") {
         try {
           propSchema.default = JSON.parse(prop.default);
@@ -197,11 +198,7 @@ export default function SchemaBuilder({
         if (prop.maximum !== undefined) propSchema.maximum = prop.maximum;
       }

-      schema.properties[prop.name] = propSchema;
-
-      if (prop.required) {
-        schema.required.push(prop.name);
-      }
+      schema[prop.name] = propSchema;
     });

     return schema;
@@ -209,10 +206,11 @@ export default function SchemaBuilder({

   const addProperty = () => {
     const newProp: SchemaProperty = {
-      name: `property_${properties.length + 1}`,
+      name: `param${properties.length + 1}`,
       type: "string",
       description: "",
       required: false,
+      secret: false,
     };
     const newIndex = properties.length;
     handlePropertiesChange([...properties, newProp]);
@@ -258,24 +256,24 @@ export default function SchemaBuilder({

     try {
       const parsed = JSON.parse(newJson);
-      if (parsed.type !== "object") {
-        setRawJsonError('Schema must have type "object" at root level');
+      if (typeof parsed !== "object" || Array.isArray(parsed)) {
+        setRawJsonError("Schema must be a JSON object");
         return;
       }
       onChange(parsed);

       // Update properties from parsed JSON
+      // Expects StackStorm-style flat format: { param_name: { type, required, secret, ... }, ... }
       const props: SchemaProperty[] = [];
-      const requiredFields = parsed.required || [];

-      if (parsed.properties) {
-        Object.entries(parsed.properties).forEach(
-          ([name, propDef]: [string, any]) => {
+      Object.entries(parsed).forEach(([name, propDef]: [string, any]) => {
+        if (propDef && typeof propDef === "object" && !Array.isArray(propDef)) {
           props.push({
             name,
             type: propDef.type || "string",
             description: propDef.description || "",
-            required: requiredFields.includes(name),
+            required: propDef.required === true,
+            secret: propDef.secret === true,
             default:
               propDef.default !== undefined
                 ? JSON.stringify(propDef.default)
@@ -287,9 +285,8 @@ export default function SchemaBuilder({
             pattern: propDef.pattern,
             enum: propDef.enum,
           });
-          },
-        );
         }
+      });

       setProperties(props);
     } catch (e: any) {
@@ -467,7 +464,8 @@ export default function SchemaBuilder({
           />
         </div>

-        {/* Required checkbox */}
+        {/* Required and Secret checkboxes */}
+        <div className="flex items-center gap-6">
           <div className="flex items-center">
             <input
               type="checkbox"
@@ -480,16 +478,43 @@ export default function SchemaBuilder({
               }
               disabled={disabled}
               className={`h-4 w-4 text-blue-600 focus:ring-blue-500 border-gray-300 rounded ${
-                disabled ? "cursor-not-allowed opacity-50" : ""
+                disabled
+                  ? "cursor-not-allowed opacity-50"
+                  : ""
               }`}
             />
             <label
               htmlFor={`required-${index}`}
               className="ml-2 text-xs font-medium text-gray-700"
             >
-              Required field
+              Required
             </label>
           </div>
+          <div className="flex items-center">
+            <input
+              type="checkbox"
+              id={`secret-${index}`}
+              checked={prop.secret}
+              onChange={(e) =>
+                updateProperty(index, {
+                  secret: e.target.checked,
+                })
+              }
+              disabled={disabled}
+              className={`h-4 w-4 text-yellow-600 focus:ring-yellow-500 border-gray-300 rounded ${
+                disabled
+                  ? "cursor-not-allowed opacity-50"
+                  : ""
+              }`}
+            />
+            <label
+              htmlFor={`secret-${index}`}
+              className="ml-2 text-xs font-medium text-gray-700"
+            >
+              Secret
+            </label>
+          </div>
+        </div>

         {/* Default value */}
         <div>
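`buildSchema` now emits the flat format directly: each builder row becomes one top-level key, with `required` and `secret` written inline instead of being collected into arrays. A trimmed, framework-free sketch of that conversion (the numeric and enum fields are omitted here):

```typescript
interface SchemaProperty {
  name: string;
  type: string;
  description: string;
  required: boolean;
  secret: boolean;
  default?: string; // JSON-encoded, as entered in the builder UI
}

function buildSchema(properties: SchemaProperty[]): Record<string, any> {
  if (properties.length === 0) return {};
  const schema: Record<string, any> = {};
  properties.forEach((prop) => {
    const propSchema: Record<string, any> = { type: prop.type };
    if (prop.description) propSchema.description = prop.description;
    // Inline flags: only written when true, so optional params stay terse.
    if (prop.required) propSchema.required = true;
    if (prop.secret) propSchema.secret = true;
    if (prop.default !== undefined && prop.default !== "") {
      try {
        propSchema.default = JSON.parse(prop.default);
      } catch {
        // Ignore defaults that are not valid JSON.
      }
    }
    schema[prop.name] = propSchema;
  });
  return schema;
}
```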
@@ -18,11 +18,7 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
   const isEditing = !!pack;

   // Store initial/database state for reset
-  const initialConfSchema = pack?.conf_schema || {
-    type: "object",
-    properties: {},
-    required: [],
-  };
+  const initialConfSchema = pack?.conf_schema || {};
   const initialConfig = pack?.config || {};

   // Form state
@@ -47,15 +43,17 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
   const createPack = useCreatePack();
   const updatePack = useUpdatePack();

-  // Check if schema has properties
+  // Check if schema has properties (flat format: each key is a parameter name)
   const hasSchemaProperties =
-    confSchema?.properties && Object.keys(confSchema.properties).length > 0;
+    confSchema &&
+    typeof confSchema === "object" &&
+    Object.keys(confSchema).length > 0;

   // Sync config values when schema changes (for ad-hoc packs only)
   useEffect(() => {
     if (!isStandard && hasSchemaProperties) {
-      // Get current schema property names
-      const schemaKeys = Object.keys(confSchema.properties || {});
+      // Get current schema property names (flat format: keys are parameter names)
+      const schemaKeys = Object.keys(confSchema);

       // Create new config with only keys that exist in schema
       const syncedConfig: Record<string, any> = {};
@@ -65,7 +63,7 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
           syncedConfig[key] = configValues[key];
         } else {
           // Use default from schema if available
-          const defaultValue = confSchema.properties[key]?.default;
+          const defaultValue = confSchema[key]?.default;
           if (defaultValue !== undefined) {
             syncedConfig[key] = defaultValue;
           }
@@ -99,10 +97,14 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
       newErrors.version = "Version is required";
     }

-    // Validate conf_schema
-    if (confSchema && confSchema.type !== "object") {
-      newErrors.confSchema =
-        'Config schema must have type "object" at root level';
+    // Validate conf_schema (flat format: each value should be an object defining a parameter)
+    if (confSchema && typeof confSchema === "object") {
+      for (const [key, val] of Object.entries(confSchema)) {
+        if (!val || typeof val !== "object" || Array.isArray(val)) {
+          newErrors.confSchema = `Invalid parameter definition for "${key}" — each parameter must be an object`;
+          break;
+        }
+      }
     }

     // Validate meta JSON
@@ -126,7 +128,7 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
     }

     const parsedConfSchema =
-      Object.keys(confSchema.properties || {}).length > 0 ? confSchema : {};
+      Object.keys(confSchema || {}).length > 0 ? confSchema : {};
     const parsedMeta = meta.trim() ? JSON.parse(meta) : {};
     const tagsList = tags
       .split(",")
@@ -201,34 +203,31 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
   };

   const insertSchemaExample = (type: "api" | "database" | "webhook") => {
-    let example;
+    let example: Record<string, any>;
     switch (type) {
       case "api":
         example = {
-          type: "object",
-          properties: {
           api_key: {
             type: "string",
             description: "API authentication key",
+            required: true,
+            secret: true,
           },
           endpoint: {
             type: "string",
             description: "API endpoint URL",
             default: "https://api.example.com",
           },
-          },
-          required: ["api_key"],
         };
         break;

       case "database":
         example = {
-          type: "object",
-          properties: {
           host: {
             type: "string",
             description: "Database host",
             default: "localhost",
+            required: true,
           },
           port: {
             type: "integer",
@@ -238,31 +237,33 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
           database: {
             type: "string",
             description: "Database name",
+            required: true,
           },
           username: {
             type: "string",
             description: "Database username",
+            required: true,
           },
           password: {
             type: "string",
             description: "Database password",
+            required: true,
+            secret: true,
           },
-          },
-          required: ["host", "database", "username", "password"],
         };
         break;

       case "webhook":
         example = {
-          type: "object",
-          properties: {
           webhook_url: {
             type: "string",
             description: "Webhook destination URL",
+            required: true,
           },
           auth_token: {
             type: "string",
             description: "Authentication token",
+            secret: true,
           },
           timeout: {
             type: "integer",
@@ -271,8 +272,6 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
             maximum: 300,
             default: 30,
           },
-          },
-          required: ["webhook_url"],
         };
         break;
     }
@@ -282,15 +281,11 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {

     // Immediately sync config values with schema defaults
     const syncedConfig: Record<string, any> = {};
-    if (example.properties) {
-      Object.entries(example.properties).forEach(
-        ([key, propDef]: [string, any]) => {
+    Object.entries(example).forEach(([key, propDef]: [string, any]) => {
       if (propDef.default !== undefined) {
         syncedConfig[key] = propDef.default;
       }
-        },
-      );
-    }
+    });
     setConfigValues(syncedConfig);
   };

@@ -578,7 +573,7 @@ export default function PackForm({ pack, onSuccess, onCancel }: PackFormProps) {
           </p>
         </div>
         <ParamSchemaForm
-          schema={confSchema.properties}
+          schema={confSchema}
           values={configValues}
           onChange={setConfigValues}
           errors={errors}
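PackForm's conf_schema validation is now a shape check on the flat map: every top-level value must itself be an object. A standalone sketch of that loop, returning the first error instead of writing into component state (the error wording follows the diff):

```typescript
// Returns an error message for the first malformed entry, or null if the
// flat schema is valid. Non-object inputs are treated as "nothing to validate".
function validateConfSchema(confSchema: any): string | null {
  if (!confSchema || typeof confSchema !== "object") return null;
  for (const [key, val] of Object.entries(confSchema)) {
    if (!val || typeof val !== "object" || Array.isArray(val)) {
      return `Invalid parameter definition for "${key}" — each parameter must be an object`;
    }
  }
  return null;
}
```

The old check (`confSchema.type !== "object"`) would now reject every valid flat schema, since the flat format has no root-level `type` key at all.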
@@ -123,6 +123,10 @@ export default function RuleForm({ rule, onSuccess, onCancel }: RuleFormProps) {
       newErrors.label = "Label is required";
     }

+    if (!description.trim()) {
+      newErrors.description = "Description is required";
+    }
+
     if (!packId) {
       newErrors.pack = "Pack is required";
     }
@@ -347,7 +351,7 @@ export default function RuleForm({ rule, onSuccess, onCancel }: RuleFormProps) {
             htmlFor="description"
             className="block text-sm font-medium text-gray-700 mb-1"
           >
-            Description
+            Description <span className="text-red-500">*</span>
           </label>
           <textarea
             id="description"
@@ -355,8 +359,13 @@ export default function RuleForm({ rule, onSuccess, onCancel }: RuleFormProps) {
             onChange={(e) => setDescription(e.target.value)}
             placeholder="Describe what this rule does..."
             rows={3}
-            className="w-full px-3 py-2 border border-gray-300 rounded-lg focus:outline-none focus:ring-2 focus:ring-blue-500"
+            className={`w-full px-3 py-2 border rounded-lg focus:outline-none focus:ring-2 focus:ring-blue-500 ${
+              errors.description ? "border-red-500" : "border-gray-300"
+            }`}
           />
+          {errors.description && (
+            <p className="mt-1 text-sm text-red-600">{errors.description}</p>
+          )}
         </div>

         {/* Enabled Toggle */}
@@ -30,16 +30,8 @@ export default function TriggerForm({
|
|||||||
const [description, setDescription] = useState("");
|
const [description, setDescription] = useState("");
|
||||||
const [webhookEnabled, setWebhookEnabled] = useState(false);
|
const [webhookEnabled, setWebhookEnabled] = useState(false);
|
||||||
const [enabled, setEnabled] = useState(true);
|
const [enabled, setEnabled] = useState(true);
|
||||||
const [paramSchema, setParamSchema] = useState<Record<string, any>>({
|
const [paramSchema, setParamSchema] = useState<Record<string, any>>({});
|
||||||
type: "object",
|
const [outSchema, setOutSchema] = useState<Record<string, any>>({});
|
||||||
properties: {},
|
|
||||||
required: [],
|
|
||||||
});
|
|
||||||
const [outSchema, setOutSchema] = useState<Record<string, any>>({
|
|
||||||
type: "object",
|
|
||||||
properties: {},
|
|
||||||
required: [],
|
|
||||||
});
|
|
||||||
const [errors, setErrors] = useState<Record<string, string>>({});
|
const [errors, setErrors] = useState<Record<string, string>>({});
|
||||||
|
|
||||||
// Fetch packs
|
// Fetch packs
|
||||||
@@ -58,20 +50,8 @@ export default function TriggerForm({
       setDescription(initialData.description || "");
       setWebhookEnabled(initialData.webhook_enabled || false);
       setEnabled(initialData.enabled ?? true);
-      setParamSchema(
-        initialData.param_schema || {
-          type: "object",
-          properties: {},
-          required: [],
-        },
-      );
-      setOutSchema(
-        initialData.out_schema || {
-          type: "object",
-          properties: {},
-          required: [],
-        },
-      );
+      setParamSchema(initialData.param_schema || {});
+      setOutSchema(initialData.out_schema || {});

       if (isEditing) {
         // Find pack by pack_ref
@@ -129,13 +109,8 @@ export default function TriggerForm({
       description: description.trim() || undefined,
       enabled,
       param_schema:
-        Object.keys(paramSchema.properties || {}).length > 0
-          ? paramSchema
-          : undefined,
-      out_schema:
-        Object.keys(outSchema.properties || {}).length > 0
-          ? outSchema
-          : undefined,
+        Object.keys(paramSchema).length > 0 ? paramSchema : undefined,
+      out_schema: Object.keys(outSchema).length > 0 ? outSchema : undefined,
     };

     if (isEditing && initialData?.ref) {
168
web/src/components/workflows/ActionPalette.tsx
Normal file
@@ -0,0 +1,168 @@
import { useState, useMemo } from "react";
import { Search, X, ChevronDown, ChevronRight, GripVertical } from "lucide-react";
import type { PaletteAction } from "@/types/workflow";

interface ActionPaletteProps {
  actions: PaletteAction[];
  isLoading: boolean;
  onAddTask: (action: PaletteAction) => void;
}

export default function ActionPalette({
  actions,
  isLoading,
  onAddTask,
}: ActionPaletteProps) {
  const [searchQuery, setSearchQuery] = useState("");
  const [collapsedPacks, setCollapsedPacks] = useState<Set<string>>(new Set());

  const filteredActions = useMemo(() => {
    if (!searchQuery.trim()) return actions;
    const query = searchQuery.toLowerCase();
    return actions.filter(
      (action) =>
        action.label?.toLowerCase().includes(query) ||
        action.ref?.toLowerCase().includes(query) ||
        action.description?.toLowerCase().includes(query) ||
        action.pack_ref?.toLowerCase().includes(query)
    );
  }, [actions, searchQuery]);

  const actionsByPack = useMemo(() => {
    const grouped = new Map<string, PaletteAction[]>();
    filteredActions.forEach((action) => {
      const packRef = action.pack_ref;
      if (!grouped.has(packRef)) {
        grouped.set(packRef, []);
      }
      grouped.get(packRef)!.push(action);
    });
    return new Map(
      [...grouped.entries()].sort((a, b) => a[0].localeCompare(b[0]))
    );
  }, [filteredActions]);

  const togglePack = (packRef: string) => {
    setCollapsedPacks((prev) => {
      const next = new Set(prev);
      if (next.has(packRef)) {
        next.delete(packRef);
      } else {
        next.add(packRef);
      }
      return next;
    });
  };

  return (
    <div className="w-64 border-r border-gray-200 bg-gray-50 flex flex-col h-full overflow-hidden">
      <div className="p-3 border-b border-gray-200 bg-white flex-shrink-0">
        <h3 className="text-sm font-semibold text-gray-700 uppercase tracking-wider mb-2">
          Action Palette
        </h3>
        <div className="relative">
          <div className="absolute inset-y-0 left-0 pl-2 flex items-center pointer-events-none">
            <Search className="h-3.5 w-3.5 text-gray-400" />
          </div>
          <input
            type="text"
            value={searchQuery}
            onChange={(e) => setSearchQuery(e.target.value)}
            placeholder="Search actions..."
            className="block w-full pl-8 pr-8 py-1.5 border border-gray-300 rounded text-xs focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
          />
          {searchQuery && (
            <button
              onClick={() => setSearchQuery("")}
              className="absolute inset-y-0 right-0 pr-2 flex items-center"
            >
              <X className="h-3.5 w-3.5 text-gray-400 hover:text-gray-600" />
            </button>
          )}
        </div>
      </div>

      <div className="flex-1 overflow-y-auto p-2">
        {isLoading ? (
          <div className="flex items-center justify-center py-8">
            <div className="animate-spin rounded-full h-6 w-6 border-b-2 border-blue-600" />
          </div>
        ) : actions.length === 0 ? (
          <div className="text-center py-8 text-xs text-gray-500">
            No actions available
          </div>
        ) : filteredActions.length === 0 ? (
          <div className="text-center py-8">
            <p className="text-xs text-gray-500">No actions match your search</p>
            <button
              onClick={() => setSearchQuery("")}
              className="mt-1 text-xs text-blue-600 hover:text-blue-800"
            >
              Clear search
            </button>
          </div>
        ) : (
          <div className="space-y-1">
            {Array.from(actionsByPack.entries()).map(
              ([packRef, packActions]) => {
                const isCollapsed = collapsedPacks.has(packRef);
                return (
                  <div key={packRef} className="rounded overflow-hidden">
                    <button
                      onClick={() => togglePack(packRef)}
                      className="w-full px-2 py-1.5 flex items-center justify-between hover:bg-gray-100 transition-colors text-left"
                    >
                      <div className="flex items-center gap-1.5">
                        {isCollapsed ? (
                          <ChevronRight className="w-3 h-3 text-gray-500 flex-shrink-0" />
                        ) : (
                          <ChevronDown className="w-3 h-3 text-gray-500 flex-shrink-0" />
                        )}
                        <span className="font-semibold text-xs text-gray-800 truncate">
                          {packRef}
                        </span>
                      </div>
                      <span className="text-[10px] text-gray-500 bg-gray-200 px-1.5 py-0.5 rounded flex-shrink-0">
                        {packActions.length}
                      </span>
                    </button>

                    {!isCollapsed && (
                      <div className="pl-1 pb-1">
                        {packActions.map((action) => (
                          <button
                            key={action.id}
                            onClick={() => onAddTask(action)}
                            className="w-full text-left px-2 py-1.5 rounded hover:bg-blue-50 hover:border-blue-200 border border-transparent transition-colors group cursor-pointer"
                            title={`Click to add "${action.label}" as a task`}
                          >
                            <div className="flex items-start gap-1.5">
                              <GripVertical className="w-3 h-3 text-gray-300 group-hover:text-blue-400 mt-0.5 flex-shrink-0" />
                              <div className="min-w-0 flex-1">
                                <div className="font-medium text-xs text-gray-900 truncate">
                                  {action.label}
                                </div>
                                <div className="font-mono text-[10px] text-gray-500 truncate">
                                  {action.ref}
                                </div>
                                {action.description && (
                                  <div className="text-[10px] text-gray-400 truncate mt-0.5">
                                    {action.description}
                                  </div>
                                )}
                              </div>
                            </div>
                          </button>
                        ))}
                      </div>
                    )}
                  </div>
                );
              }
            )}
          </div>
        )}
      </div>
    </div>
  );
}
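The pack grouping in `actionsByPack` above can be exercised outside React. A minimal standalone sketch of the same logic (the `PaletteAction` shape here is trimmed to just the fields the grouping touches, which is an assumption about `@/types/workflow`):

```typescript
// Trimmed stand-in for PaletteAction (assumed shape; only the fields used here).
interface PaletteAction {
  id: string;
  ref: string;
  label: string;
  pack_ref: string;
  description?: string;
}

// Group actions by pack_ref, then sort packs alphabetically, as actionsByPack does.
function groupByPack(actions: PaletteAction[]): Map<string, PaletteAction[]> {
  const grouped = new Map<string, PaletteAction[]>();
  for (const action of actions) {
    const list = grouped.get(action.pack_ref) ?? [];
    list.push(action);
    grouped.set(action.pack_ref, list);
  }
  // Rebuild the Map from entries sorted by pack name for stable display order.
  return new Map([...grouped.entries()].sort((a, b) => a[0].localeCompare(b[0])));
}
```

Because a `Map` iterates in insertion order, rebuilding it from sorted entries is what gives the palette its stable alphabetical pack listing.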
1143
web/src/components/workflows/TaskInspector.tsx
Normal file
File diff suppressed because it is too large
417
web/src/components/workflows/TaskNode.tsx
Normal file
@@ -0,0 +1,417 @@
import { memo, useCallback, useRef, useState } from "react";
import { Trash2, Settings, GripVertical } from "lucide-react";
import type { WorkflowTask, TransitionPreset } from "@/types/workflow";
import {
  PRESET_LABELS,
  PRESET_WHEN,
  classifyTransitionWhen,
} from "@/types/workflow";

export type { TransitionPreset };

interface TaskNodeProps {
  task: WorkflowTask;
  isSelected: boolean;
  allTaskNames: string[];
  onSelect: (taskId: string) => void;
  onDelete: (taskId: string) => void;
  onPositionChange: (
    taskId: string,
    position: { x: number; y: number },
  ) => void;
  onStartConnection: (taskId: string, preset: TransitionPreset) => void;
  connectingFrom: { taskId: string; preset: TransitionPreset } | null;
  onCompleteConnection: (targetTaskId: string) => void;
}

/** Handle visual configuration for each transition preset */
const HANDLE_CONFIG: {
  preset: TransitionPreset;
  color: string;
  hoverColor: string;
  activeColor: string;
  ringColor: string;
}[] = [
  {
    preset: "succeeded",
    color: "#22c55e",
    hoverColor: "#16a34a",
    activeColor: "#15803d",
    ringColor: "rgba(34, 197, 94, 0.3)",
  },
  {
    preset: "failed",
    color: "#ef4444",
    hoverColor: "#dc2626",
    activeColor: "#b91c1c",
    ringColor: "rgba(239, 68, 68, 0.3)",
  },
  {
    preset: "always",
    color: "#6b7280",
    hoverColor: "#4b5563",
    activeColor: "#374151",
    ringColor: "rgba(107, 114, 128, 0.3)",
  },
];

/**
 * Check if a task has an active transition matching a given preset.
 */
function hasActiveTransition(
  task: WorkflowTask,
  preset: TransitionPreset,
): boolean {
  if (!task.next) return false;
  const whenExpr = PRESET_WHEN[preset];
  return task.next.some((t) => {
    if (whenExpr === undefined) return t.when === undefined;
    return (
      t.when?.toLowerCase().replace(/\s+/g, "") ===
      whenExpr.toLowerCase().replace(/\s+/g, "")
    );
  });
}

/**
 * Compute a short summary of outgoing transitions for the node body.
 */
function transitionSummary(task: WorkflowTask): string | null {
  if (!task.next || task.next.length === 0) return null;
  const totalTargets = task.next.reduce(
    (sum, t) => sum + (t.do?.length ?? 0),
    0,
  );
  if (
    totalTargets === 0 &&
    task.next.some((t) => t.publish && t.publish.length > 0)
  ) {
    return `${task.next.length} transition${task.next.length !== 1 ? "s" : ""} (publish only)`;
  }
  if (totalTargets === 0) return null;
  return `${totalTargets} target${totalTargets !== 1 ? "s" : ""} via ${task.next.length} transition${task.next.length !== 1 ? "s" : ""}`;
}

function TaskNodeInner({
  task,
  isSelected,
  onSelect,
  onDelete,
  onPositionChange,
  onStartConnection,
  connectingFrom,
  onCompleteConnection,
}: TaskNodeProps) {
  const nodeRef = useRef<HTMLDivElement>(null);
  const [isDragging, setIsDragging] = useState(false);
  const [hoveredHandle, setHoveredHandle] = useState<TransitionPreset | null>(
    null,
  );
  const [isInputHandleHovered, setIsInputHandleHovered] = useState(false);
  const dragOffset = useRef({ x: 0, y: 0 });

  const handleMouseDown = useCallback(
    (e: React.MouseEvent) => {
      const target = e.target as HTMLElement;
      if (target.closest("[data-action-button]")) return;
      if (target.closest("[data-handle]")) return;

      e.stopPropagation();
      setIsDragging(true);
      dragOffset.current = {
        x: e.clientX - task.position.x,
        y: e.clientY - task.position.y,
      };

      const handleMouseMove = (moveEvent: MouseEvent) => {
        const newX = moveEvent.clientX - dragOffset.current.x;
        const newY = moveEvent.clientY - dragOffset.current.y;
        onPositionChange(task.id, {
          x: Math.max(0, newX),
          y: Math.max(0, newY),
        });
      };

      const handleMouseUp = () => {
        setIsDragging(false);
        document.removeEventListener("mousemove", handleMouseMove);
        document.removeEventListener("mouseup", handleMouseUp);
      };

      document.addEventListener("mousemove", handleMouseMove);
      document.addEventListener("mouseup", handleMouseUp);
    },
    [task.id, task.position.x, task.position.y, onPositionChange],
  );

  const handleClick = useCallback(
    (e: React.MouseEvent) => {
      e.stopPropagation();
      if (connectingFrom && connectingFrom.taskId !== task.id) {
        onCompleteConnection(task.id);
      } else if (!connectingFrom) {
        onSelect(task.id);
      }
    },
    [task.id, onSelect, connectingFrom, onCompleteConnection],
  );

  const handleDelete = useCallback(
    (e: React.MouseEvent) => {
      e.stopPropagation();
      onDelete(task.id);
    },
    [task.id, onDelete],
  );

  const handleHandleMouseDown = useCallback(
    (e: React.MouseEvent, preset: TransitionPreset) => {
      e.stopPropagation();
      e.preventDefault();
      onStartConnection(task.id, preset);
    },
    [task.id, onStartConnection],
  );

  const handleInputHandleMouseUp = useCallback(
    (e: React.MouseEvent) => {
      e.stopPropagation();
      if (connectingFrom && connectingFrom.taskId !== task.id) {
        onCompleteConnection(task.id);
      }
    },
    [task.id, connectingFrom, onCompleteConnection],
  );

  const isConnectionTarget =
    connectingFrom !== null && connectingFrom.taskId !== task.id;

  const borderColor = isSelected
    ? "border-blue-500 ring-2 ring-blue-200"
    : isConnectionTarget
      ? "border-purple-400 ring-2 ring-purple-200"
      : "border-gray-300 hover:border-gray-400";

  const hasAction = task.action && task.action.length > 0;
  const summary = transitionSummary(task);

  // Count custom transitions (those not matching any preset)
  const customTransitionCount = (task.next || []).filter((t) => {
    const ct = classifyTransitionWhen(t.when);
    return ct === "custom";
  }).length;

  return (
    <div
      ref={nodeRef}
      className={`absolute select-none ${isDragging ? "cursor-grabbing z-50" : "cursor-grab z-10"}`}
      style={{
        left: task.position.x,
        top: task.position.y,
        width: 240,
      }}
      onMouseDown={handleMouseDown}
      onClick={handleClick}
    >
      {/* Input handle (top center) — drop target */}
      <div
        data-handle
        className="absolute left-1/2 -translate-x-1/2 -top-[7px] z-20"
        onMouseUp={handleInputHandleMouseUp}
        onMouseEnter={() => setIsInputHandleHovered(true)}
        onMouseLeave={() => setIsInputHandleHovered(false)}
      >
        <div
          className="transition-all duration-150 rounded-full border-2 border-white shadow-sm"
          style={{
            width:
              isConnectionTarget && isInputHandleHovered
                ? 16
                : isConnectionTarget
                  ? 14
                  : 10,
            height:
              isConnectionTarget && isInputHandleHovered
                ? 16
                : isConnectionTarget
                  ? 14
                  : 10,
            backgroundColor:
              isConnectionTarget && isInputHandleHovered
                ? "#8b5cf6"
                : isConnectionTarget
                  ? "#a78bfa"
                  : "#9ca3af",
            boxShadow:
              isConnectionTarget && isInputHandleHovered
                ? "0 0 0 4px rgba(139, 92, 246, 0.3), 0 1px 3px rgba(0,0,0,0.2)"
                : isConnectionTarget
                  ? "0 0 0 3px rgba(167, 139, 250, 0.3), 0 1px 2px rgba(0,0,0,0.15)"
                  : "0 1px 2px rgba(0,0,0,0.1)",
            cursor: isConnectionTarget ? "pointer" : "default",
          }}
        />
      </div>

      <div
        className={`bg-white rounded-lg border-2 shadow-sm transition-colors ${borderColor}`}
      >
        {/* Header */}
        <div className="flex items-center gap-1.5 px-2.5 py-1.5 rounded-t-md bg-blue-500 bg-opacity-10 border-b border-gray-100">
          <GripVertical className="w-3.5 h-3.5 text-gray-400 flex-shrink-0" />
          <div className="flex-1 min-w-0">
            <div className="font-semibold text-xs text-gray-900 truncate">
              {task.name}
            </div>
          </div>
        </div>

        {/* Body */}
        <div className="px-2.5 py-2">
          {hasAction ? (
            <div className="font-mono text-[11px] text-gray-600 truncate">
              {task.action}
            </div>
          ) : (
            <div className="text-[11px] text-orange-500 italic">
              No action assigned
            </div>
          )}

          {/* Input summary */}
          {Object.keys(task.input).length > 0 && (
            <div className="mt-1.5 text-[10px] text-gray-400">
              {Object.keys(task.input).length} input
              {Object.keys(task.input).length !== 1 ? "s" : ""}
            </div>
          )}

          {/* Transition summary */}
          {summary && (
            <div className="mt-1 text-[10px] text-gray-400">{summary}</div>
          )}

          {/* Delay badge */}
          {task.delay && (
            <div className="mt-1 inline-block px-1.5 py-0.5 bg-yellow-50 border border-yellow-200 rounded text-[10px] text-yellow-700 truncate max-w-full">
              delay: {task.delay}s
            </div>
          )}

          {/* With-items badge */}
          {task.with_items && (
            <div className="mt-1 inline-block px-1.5 py-0.5 bg-indigo-50 border border-indigo-200 rounded text-[10px] text-indigo-700 truncate max-w-full">
              with_items
            </div>
          )}

          {/* Retry badge */}
          {task.retry && (
            <div className="mt-1 inline-block px-1.5 py-0.5 bg-orange-50 border border-orange-200 rounded text-[10px] text-orange-700 ml-1">
              retry: {task.retry.count}×
            </div>
          )}

          {/* Custom transitions badge */}
          {customTransitionCount > 0 && (
            <div className="mt-1 inline-block px-1.5 py-0.5 bg-violet-50 border border-violet-200 rounded text-[10px] text-violet-700 ml-1">
              {customTransitionCount} custom transition
              {customTransitionCount !== 1 ? "s" : ""}
            </div>
          )}
        </div>

        {/* Footer actions */}
        <div className="flex items-center justify-end px-2 py-1.5 border-t border-gray-100 bg-gray-50 rounded-b-md">
          <div className="flex gap-1">
            <button
              data-action-button
              onClick={(e) => {
                e.stopPropagation();
                onSelect(task.id);
              }}
              className="p-1 rounded hover:bg-blue-100 text-gray-400 hover:text-blue-600 transition-colors"
              title="Configure task"
            >
              <Settings className="w-3 h-3" />
            </button>
            <button
              data-action-button
              onClick={handleDelete}
              className="p-1 rounded hover:bg-red-100 text-gray-400 hover:text-red-600 transition-colors"
              title="Delete task"
            >
              <Trash2 className="w-3 h-3" />
            </button>
          </div>
        </div>

        {/* Connection target overlay */}
        {isConnectionTarget && (
          <div className="absolute inset-0 rounded-lg bg-purple-100 bg-opacity-20 pointer-events-none flex items-center justify-center">
            <div className="text-xs font-medium text-purple-600 bg-white px-2 py-1 rounded shadow-sm">
              Drop to connect
            </div>
          </div>
        )}
      </div>

      {/* Output handles (bottom) — drag sources */}
      <div
        className="flex items-center justify-center gap-3 -mt-[7px] relative z-20"
        data-handle
      >
        {HANDLE_CONFIG.map((handle) => {
          const isActive = hasActiveTransition(task, handle.preset);
          const isHovered = hoveredHandle === handle.preset;
          const isCurrentlyDragging =
            connectingFrom?.taskId === task.id &&
            connectingFrom?.preset === handle.preset;

          return (
            <div
              key={handle.preset}
              className="relative group"
              onMouseEnter={() => setHoveredHandle(handle.preset)}
              onMouseLeave={() => setHoveredHandle(null)}
            >
              <div
                data-handle
                onMouseDown={(e) => handleHandleMouseDown(e, handle.preset)}
                className="transition-all duration-150 rounded-full border-2 border-white cursor-crosshair"
                style={{
                  width: isHovered || isCurrentlyDragging ? 14 : 10,
                  height: isHovered || isCurrentlyDragging ? 14 : 10,
                  backgroundColor: isCurrentlyDragging
                    ? handle.activeColor
                    : isHovered
                      ? handle.hoverColor
                      : isActive
                        ? handle.color
                        : `${handle.color}80`,
                  boxShadow: isCurrentlyDragging
                    ? `0 0 0 4px ${handle.ringColor}, 0 1px 3px rgba(0,0,0,0.2)`
                    : isHovered
                      ? `0 0 0 3px ${handle.ringColor}, 0 1px 2px rgba(0,0,0,0.15)`
                      : "0 1px 2px rgba(0,0,0,0.1)",
                }}
              />
              {/* Tooltip */}
              <div
                className={`absolute left-1/2 -translate-x-1/2 top-full mt-1.5 px-2 py-1 bg-gray-900 text-white text-[10px] font-medium rounded shadow-lg whitespace-nowrap pointer-events-none transition-opacity duration-150 ${
                  isHovered ? "opacity-100" : "opacity-0"
                }`}
              >
                {PRESET_LABELS[handle.preset]}
                <div className="absolute left-1/2 -translate-x-1/2 -top-1 w-2 h-2 bg-gray-900 rotate-45" />
              </div>
            </div>
          );
        })}
      </div>
    </div>
  );
}

const TaskNode = memo(TaskNodeInner);
export default TaskNode;
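`hasActiveTransition` above matches a transition's `when` expression against a preset by comparing the two strings case-insensitively with all whitespace stripped. That normalization can be sketched standalone; note the `PRESET_WHEN` values below are assumed Orquesta-style expressions for illustration, not taken from `@/types/workflow`:

```typescript
// Assumed preset expressions (hypothetical values, Orquesta-style);
// "always" is represented by the absence of a `when` clause.
const PRESET_WHEN: Record<string, string | undefined> = {
  succeeded: "<% succeeded() %>",
  failed: "<% failed() %>",
  always: undefined,
};

// Whitespace- and case-insensitive match of a `when` string against a preset,
// mirroring the comparison inside hasActiveTransition.
function matchesPreset(when: string | undefined, preset: string): boolean {
  const expr = PRESET_WHEN[preset];
  if (expr === undefined) return when === undefined;
  const norm = (s: string) => s.toLowerCase().replace(/\s+/g, "");
  return when !== undefined && norm(when) === norm(expr);
}
```

Normalizing both sides means hand-edited YAML like `<%  SUCCEEDED() %>` still lights up the green handle, while anything that normalizes differently is counted as a custom transition.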
275
web/src/components/workflows/WorkflowCanvas.tsx
Normal file
@@ -0,0 +1,275 @@
import { useState, useCallback, useRef, useMemo } from "react";
import TaskNode from "./TaskNode";
import type { TransitionPreset } from "./TaskNode";
import WorkflowEdges from "./WorkflowEdges";
import type { EdgeHoverInfo } from "./WorkflowEdges";
import type {
  WorkflowTask,
  PaletteAction,
  WorkflowEdge,
} from "@/types/workflow";
import {
  deriveEdges,
  generateUniqueTaskName,
  generateTaskId,
  PRESET_LABELS,
} from "@/types/workflow";
import { Plus } from "lucide-react";

interface WorkflowCanvasProps {
  tasks: WorkflowTask[];
  selectedTaskId: string | null;
  availableActions: PaletteAction[];
  onSelectTask: (taskId: string | null) => void;
  onUpdateTask: (taskId: string, updates: Partial<WorkflowTask>) => void;
  onDeleteTask: (taskId: string) => void;
  onAddTask: (task: WorkflowTask) => void;
  onSetConnection: (
    fromTaskId: string,
    preset: TransitionPreset,
    toTaskName: string,
  ) => void;
  onEdgeHover?: (info: EdgeHoverInfo | null) => void;
}

/** Label color mapping for the connecting banner */
const PRESET_BANNER_COLORS: Record<TransitionPreset, string> = {
  succeeded: "text-green-200 font-bold",
  failed: "text-red-200 font-bold",
  always: "text-gray-200 font-bold",
};

export default function WorkflowCanvas({
  tasks,
  selectedTaskId,
  onSelectTask,
  onUpdateTask,
  onDeleteTask,
  onAddTask,
  onSetConnection,
  onEdgeHover,
}: WorkflowCanvasProps) {
  const canvasRef = useRef<HTMLDivElement>(null);
  const [connectingFrom, setConnectingFrom] = useState<{
    taskId: string;
    preset: TransitionPreset;
  } | null>(null);
  const [mousePosition, setMousePosition] = useState<{
    x: number;
    y: number;
  } | null>(null);

  const allTaskNames = useMemo(() => tasks.map((t) => t.name), [tasks]);

  const edges: WorkflowEdge[] = useMemo(() => deriveEdges(tasks), [tasks]);

  const handleCanvasClick = useCallback(
    (e: React.MouseEvent) => {
      // Only deselect if clicking the canvas background
      if (
        e.target === canvasRef.current ||
        (e.target as HTMLElement).dataset.canvasBg === "true"
      ) {
        if (connectingFrom) {
          setConnectingFrom(null);
          setMousePosition(null);
        } else {
          onSelectTask(null);
        }
      }
    },
    [onSelectTask, connectingFrom],
  );

  const handleCanvasMouseMove = useCallback(
    (e: React.MouseEvent) => {
      if (connectingFrom && canvasRef.current) {
        const rect = canvasRef.current.getBoundingClientRect();
        const scrollLeft = canvasRef.current.scrollLeft;
        const scrollTop = canvasRef.current.scrollTop;
        setMousePosition({
          x: e.clientX - rect.left + scrollLeft,
          y: e.clientY - rect.top + scrollTop,
        });
      }
    },
    [connectingFrom],
  );

  const handleCanvasMouseUp = useCallback(() => {
    // If we're connecting and mouseup happens on the canvas (not on a node),
    // cancel the connection
    if (connectingFrom) {
      setConnectingFrom(null);
      setMousePosition(null);
    }
  }, [connectingFrom]);

  const handlePositionChange = useCallback(
    (taskId: string, position: { x: number; y: number }) => {
      onUpdateTask(taskId, { position });
    },
    [onUpdateTask],
  );

  const handleStartConnection = useCallback(
    (taskId: string, preset: TransitionPreset) => {
      setConnectingFrom({ taskId, preset });
    },
    [],
  );

  const handleCompleteConnection = useCallback(
    (targetTaskId: string) => {
      if (!connectingFrom) return;

      const targetTask = tasks.find((t) => t.id === targetTaskId);
      if (!targetTask) return;

      onSetConnection(
        connectingFrom.taskId,
        connectingFrom.preset,
        targetTask.name,
      );
      setConnectingFrom(null);
      setMousePosition(null);
    },
    [connectingFrom, tasks, onSetConnection],
  );

  const handleAddEmptyTask = useCallback(() => {
    const name = generateUniqueTaskName(tasks);
    // Position new tasks below existing ones
    let maxY = 0;
    for (const task of tasks) {
      if (task.position.y > maxY) {
        maxY = task.position.y;
      }
    }
    const newTask: WorkflowTask = {
      id: generateTaskId(),
      name,
      action: "",
      input: {},
      position: {
        x: 300,
        y: tasks.length === 0 ? 60 : maxY + 160,
      },
    };
    onAddTask(newTask);
    onSelectTask(newTask.id);
  }, [tasks, onAddTask, onSelectTask]);

  // Calculate minimum canvas dimensions based on node positions
  const canvasDimensions = useMemo(() => {
    let maxX = 800;
    let maxY = 600;
    for (const task of tasks) {
      maxX = Math.max(maxX, task.position.x + 340);
      maxY = Math.max(maxY, task.position.y + 220);
    }
    return { width: maxX, height: maxY };
  }, [tasks]);

  return (
    <div
      className="flex-1 overflow-auto bg-gray-100 relative"
      ref={canvasRef}
      onClick={handleCanvasClick}
      onMouseMove={handleCanvasMouseMove}
      onMouseUp={handleCanvasMouseUp}
    >
      {/* Grid background */}
      <div
        data-canvas-bg="true"
        className="absolute inset-0"
        style={{
          minWidth: canvasDimensions.width,
          minHeight: canvasDimensions.height,
          backgroundImage: `
            linear-gradient(to right, rgba(0,0,0,0.03) 1px, transparent 1px),
            linear-gradient(to bottom, rgba(0,0,0,0.03) 1px, transparent 1px)
          `,
          backgroundSize: "20px 20px",
        }}
      />

      {/* Connecting mode indicator */}
      {connectingFrom && (
        <div className="sticky top-0 left-0 right-0 z-50 flex justify-center pointer-events-none">
          <div className="mt-3 px-4 py-2 bg-purple-600 text-white text-sm font-medium rounded-full shadow-lg pointer-events-auto">
            Drag to a task to connect as{" "}
            <span className={PRESET_BANNER_COLORS[connectingFrom.preset]}>
              {PRESET_LABELS[connectingFrom.preset]}
            </span>{" "}
            transition — or release to cancel
          </div>
        </div>
      )}

      {/* Edge rendering layer */}
      <WorkflowEdges
        edges={edges}
        tasks={tasks}
        connectingFrom={connectingFrom}
        mousePosition={mousePosition}
        onEdgeHover={onEdgeHover}
      />

      {/* Task nodes */}
      {tasks.map((task) => (
        <TaskNode
          key={task.id}
          task={task}
          isSelected={task.id === selectedTaskId}
          allTaskNames={allTaskNames}
          onSelect={onSelectTask}
          onDelete={onDeleteTask}
          onPositionChange={handlePositionChange}
          onStartConnection={handleStartConnection}
          connectingFrom={connectingFrom}
          onCompleteConnection={handleCompleteConnection}
        />
      ))}

      {/* Empty state / Add task button */}
      {tasks.length === 0 ? (
        <div
          className="absolute inset-0 flex items-center justify-center pointer-events-none"
          style={{
            minWidth: canvasDimensions.width,
            minHeight: canvasDimensions.height,
          }}
        >
          <div className="text-center pointer-events-auto">
            <div className="w-16 h-16 mx-auto mb-4 rounded-full bg-gray-200 flex items-center justify-center">
              <Plus className="w-8 h-8 text-gray-400" />
            </div>
            <h3 className="text-lg font-medium text-gray-600 mb-2">
              Empty Workflow
            </h3>
            <p className="text-sm text-gray-400 mb-4 max-w-xs">
              Add tasks from the action palette on the left, or click the button
              below to add a blank task.
            </p>
            <button
              onClick={handleAddEmptyTask}
              className="px-4 py-2 bg-blue-600 text-white text-sm font-medium rounded-lg hover:bg-blue-700 transition-colors shadow-sm"
            >
              <Plus className="w-4 h-4 inline-block mr-1.5 -mt-0.5" />
              Add First Task
            </button>
          </div>
        </div>
      ) : (
        <button
          onClick={handleAddEmptyTask}
          className="fixed bottom-6 right-6 z-40 w-12 h-12 bg-blue-600 text-white rounded-full shadow-lg hover:bg-blue-700 transition-colors flex items-center justify-center"
|
||||||
|
title="Add a new task"
|
||||||
|
>
|
||||||
|
<Plus className="w-6 h-6" />
|
||||||
|
</button>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
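The canvas sizing rule above can be lifted out of the component as a pure function for illustration (a sketch; the 340x220 offsets appear to be the node footprint plus padding used by this canvas):

```typescript
// Standalone restatement of the canvasDimensions memo: the canvas is at
// least 800x600 and grows to fit the furthest node plus a 340x220 margin.
type Positioned = { position: { x: number; y: number } };

function canvasSize(tasks: Positioned[]): { width: number; height: number } {
  let maxX = 800;
  let maxY = 600;
  for (const task of tasks) {
    maxX = Math.max(maxX, task.position.x + 340);
    maxY = Math.max(maxY, task.position.y + 220);
  }
  return { width: maxX, height: maxY };
}
```

A node at (700, 500) pushes the canvas to 1040x720, while an empty task list keeps the 800x600 floor.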
379
web/src/components/workflows/WorkflowEdges.tsx
Normal file
@@ -0,0 +1,379 @@
import { memo, useMemo } from "react";
import type { WorkflowEdge, WorkflowTask, EdgeType } from "@/types/workflow";
import type { TransitionPreset } from "./TaskNode";

export interface EdgeHoverInfo {
  taskId: string;
  transitionIndex: number;
}

interface WorkflowEdgesProps {
  edges: WorkflowEdge[];
  tasks: WorkflowTask[];
  /** Width of each task node (must match TaskNode width) */
  nodeWidth?: number;
  /** Approximate height of each task node */
  nodeHeight?: number;
  /** The task ID currently being connected from (for preview line) */
  connectingFrom?: { taskId: string; preset: TransitionPreset } | null;
  /** Mouse position for drawing the preview connection line */
  mousePosition?: { x: number; y: number } | null;
  /** Called when the mouse enters/leaves an edge hit area */
  onEdgeHover?: (info: EdgeHoverInfo | null) => void;
}

const NODE_WIDTH = 240;
const NODE_HEIGHT = 120;

/** Color for each edge type */
const EDGE_COLORS: Record<EdgeType, string> = {
  success: "#22c55e", // green-500
  failure: "#ef4444", // red-500
  complete: "#6b7280", // gray-500 (unconditional / always)
  custom: "#8b5cf6", // violet-500
};

const EDGE_DASH: Record<EdgeType, string> = {
  success: "",
  failure: "6,4",
  complete: "4,4",
  custom: "8,4,2,4",
};

/** Map presets to edge colors for the preview line */
const PRESET_COLORS: Record<TransitionPreset, string> = {
  succeeded: EDGE_COLORS.success,
  failed: EDGE_COLORS.failure,
  always: EDGE_COLORS.complete,
};

/** Calculate the center-bottom of a task node */
function getNodeBottomCenter(
  task: WorkflowTask,
  nodeWidth: number,
  nodeHeight: number,
) {
  return {
    x: task.position.x + nodeWidth / 2,
    y: task.position.y + nodeHeight,
  };
}

/** Calculate the center-top of a task node */
function getNodeTopCenter(task: WorkflowTask, nodeWidth: number) {
  return {
    x: task.position.x + nodeWidth / 2,
    y: task.position.y,
  };
}

/** Calculate the left-center of a task node */
function getNodeLeftCenter(task: WorkflowTask, nodeHeight: number) {
  return {
    x: task.position.x,
    y: task.position.y + nodeHeight / 2,
  };
}

/** Calculate the right-center of a task node */
function getNodeRightCenter(
  task: WorkflowTask,
  nodeWidth: number,
  nodeHeight: number,
) {
  return {
    x: task.position.x + nodeWidth,
    y: task.position.y + nodeHeight / 2,
  };
}

/**
 * Determine the best connection points between two nodes.
 * Returns the start and end points for the edge.
 */
function getBestConnectionPoints(
  fromTask: WorkflowTask,
  toTask: WorkflowTask,
  nodeWidth: number,
  nodeHeight: number,
): { start: { x: number; y: number }; end: { x: number; y: number } } {
  const fromCenter = {
    x: fromTask.position.x + nodeWidth / 2,
    y: fromTask.position.y + nodeHeight / 2,
  };
  const toCenter = {
    x: toTask.position.x + nodeWidth / 2,
    y: toTask.position.y + nodeHeight / 2,
  };

  const dx = toCenter.x - fromCenter.x;
  const dy = toCenter.y - fromCenter.y;

  // If the target is mostly below the source, use bottom→top
  if (dy > 0 && Math.abs(dy) > Math.abs(dx) * 0.5) {
    return {
      start: getNodeBottomCenter(fromTask, nodeWidth, nodeHeight),
      end: getNodeTopCenter(toTask, nodeWidth),
    };
  }

  // If the target is mostly above the source, use top→bottom
  if (dy < 0 && Math.abs(dy) > Math.abs(dx) * 0.5) {
    return {
      start: getNodeTopCenter(fromTask, nodeWidth),
      end: getNodeBottomCenter(toTask, nodeWidth, nodeHeight),
    };
  }

  // If the target is to the right, use right→left
  if (dx > 0) {
    return {
      start: getNodeRightCenter(fromTask, nodeWidth, nodeHeight),
      end: getNodeLeftCenter(toTask, nodeHeight),
    };
  }

  // Target is to the left, use left→right
  return {
    start: getNodeLeftCenter(fromTask, nodeHeight),
    end: getNodeRightCenter(toTask, nodeWidth, nodeHeight),
  };
}

/**
 * Build an SVG path string for a curved edge between two points.
 * Uses a cubic bezier curve.
 */
function buildCurvePath(
  start: { x: number; y: number },
  end: { x: number; y: number },
): string {
  const dx = end.x - start.x;
  const dy = end.y - start.y;

  // Determine control points based on dominant direction
  let cp1: { x: number; y: number };
  let cp2: { x: number; y: number };

  if (Math.abs(dy) > Math.abs(dx) * 0.5) {
    // Mostly vertical connection
    const offset = Math.min(Math.abs(dy) * 0.5, 80);
    const direction = dy > 0 ? 1 : -1;
    cp1 = { x: start.x, y: start.y + offset * direction };
    cp2 = { x: end.x, y: end.y - offset * direction };
  } else {
    // Mostly horizontal connection
    const offset = Math.min(Math.abs(dx) * 0.5, 80);
    const direction = dx > 0 ? 1 : -1;
    cp1 = { x: start.x + offset * direction, y: start.y };
    cp2 = { x: end.x - offset * direction, y: end.y };
  }

  return `M ${start.x} ${start.y} C ${cp1.x} ${cp1.y}, ${cp2.x} ${cp2.y}, ${end.x} ${end.y}`;
}

function WorkflowEdgesInner({
  edges,
  tasks,
  nodeWidth = NODE_WIDTH,
  nodeHeight = NODE_HEIGHT,
  connectingFrom,
  mousePosition,
  onEdgeHover,
}: WorkflowEdgesProps) {
  const taskMap = useMemo(() => {
    const map = new Map<string, WorkflowTask>();
    for (const task of tasks) {
      map.set(task.id, task);
    }
    return map;
  }, [tasks]);

  // Calculate SVG bounds to cover all nodes + padding
  const svgBounds = useMemo(() => {
    if (tasks.length === 0) return { width: 2000, height: 2000 };
    let maxX = 0;
    let maxY = 0;
    for (const task of tasks) {
      maxX = Math.max(maxX, task.position.x + nodeWidth + 100);
      maxY = Math.max(maxY, task.position.y + nodeHeight + 100);
    }
    return {
      width: Math.max(maxX, 2000),
      height: Math.max(maxY, 2000),
    };
  }, [tasks, nodeWidth, nodeHeight]);

  const renderedEdges = useMemo(() => {
    return edges
      .map((edge, index) => {
        const fromTask = taskMap.get(edge.from);
        const toTask = taskMap.get(edge.to);
        if (!fromTask || !toTask) return null;

        const { start, end } = getBestConnectionPoints(
          fromTask,
          toTask,
          nodeWidth,
          nodeHeight,
        );

        const pathD = buildCurvePath(start, end);
        const color =
          edge.color || EDGE_COLORS[edge.type] || EDGE_COLORS.complete;
        const dash = EDGE_DASH[edge.type] || "";

        // Calculate label position (midpoint of curve)
        const labelX = (start.x + end.x) / 2;
        const labelY = (start.y + end.y) / 2 - 8;

        // Measure approximate label width
        const labelText = edge.label || "";
        const labelWidth = Math.max(labelText.length * 5.5 + 12, 48);
        const arrowId = edge.color
          ? `arrow-custom-${index}`
          : `arrow-${edge.type}`;

        return (
          <g key={`edge-${index}-${edge.from}-${edge.to}`}>
            {/* Edge path */}
            <path
              d={pathD}
              fill="none"
              stroke={color}
              strokeWidth={2}
              strokeDasharray={dash}
              markerEnd={`url(#${arrowId})`}
              className="transition-opacity"
              opacity={0.75}
            />
            {/* Wider invisible path for easier hovering */}
            <path
              d={pathD}
              fill="none"
              stroke="transparent"
              strokeWidth={12}
              className="cursor-pointer"
              onMouseEnter={() =>
                onEdgeHover?.({
                  taskId: edge.from,
                  transitionIndex: edge.transitionIndex,
                })
              }
              onMouseLeave={() => onEdgeHover?.(null)}
            />
            {/* Label */}
            {edge.label && (
              <g>
                <rect
                  x={labelX - labelWidth / 2}
                  y={labelY - 7}
                  width={labelWidth}
                  height={14}
                  rx={3}
                  fill="white"
                  stroke={color}
                  strokeWidth={0.5}
                  opacity={0.9}
                />
                <text
                  x={labelX}
                  y={labelY + 3}
                  textAnchor="middle"
                  fontSize={9}
                  fontWeight={500}
                  fill={color}
                  className="select-none pointer-events-none"
                >
                  {labelText.length > 24
                    ? labelText.slice(0, 21) + "..."
                    : labelText}
                </text>
              </g>
            )}
          </g>
        );
      })
      .filter(Boolean);
  }, [edges, taskMap, nodeWidth, nodeHeight, onEdgeHover]);

  // Preview line when connecting
  const previewLine = useMemo(() => {
    if (!connectingFrom || !mousePosition) return null;
    const fromTask = taskMap.get(connectingFrom.taskId);
    if (!fromTask) return null;

    const start = getNodeBottomCenter(fromTask, nodeWidth, nodeHeight);
    const end = mousePosition;
    const pathD = buildCurvePath(start, end);
    const color = PRESET_COLORS[connectingFrom.preset] || EDGE_COLORS.complete;

    return (
      <path
        d={pathD}
        fill="none"
        stroke={color}
        strokeWidth={2}
        strokeDasharray="6,4"
        opacity={0.5}
        className="pointer-events-none"
      />
    );
  }, [connectingFrom, mousePosition, taskMap, nodeWidth, nodeHeight]);

  return (
    <svg
      className="absolute inset-0 pointer-events-none overflow-visible"
      width={svgBounds.width}
      height={svgBounds.height}
      style={{ zIndex: 1 }}
    >
      <defs>
        {/* Arrow markers for each edge type */}
        {Object.entries(EDGE_COLORS).map(([type, color]) => (
          <marker
            key={`arrow-${type}`}
            id={`arrow-${type}`}
            viewBox="0 0 10 10"
            refX={9}
            refY={5}
            markerWidth={8}
            markerHeight={8}
            orient="auto-start-reverse"
          >
            <path d="M 0 0 L 10 5 L 0 10 z" fill={color} opacity={0.8} />
          </marker>
        ))}
      </defs>

      {/* Render edges */}
      <g className="pointer-events-auto">
        {/* Dynamic arrow markers for custom-colored edges */}
        {edges.map((edge, index) => {
          if (!edge.color) return null;
          return (
            <marker
              key={`arrow-custom-${index}`}
              id={`arrow-custom-${index}`}
              viewBox="0 0 10 10"
              refX={9}
              refY={5}
              markerWidth={8}
              markerHeight={8}
              orient="auto-start-reverse"
            >
              <path d="M 0 0 L 10 5 L 0 10 z" fill={edge.color} opacity={0.8} />
            </marker>
          );
        })}
        {renderedEdges}
      </g>

      {/* Preview line */}
      {previewLine}
    </svg>
  );
}

const WorkflowEdges = memo(WorkflowEdgesInner);
export default WorkflowEdges;
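The `buildCurvePath` helper in the file above is a pure function, so it can be exercised outside React. A standalone sketch with the same control-point logic:

```typescript
// Same cubic-bezier path construction as WorkflowEdges.tsx: offsets the
// two control points along the dominant axis, capped at 80px, so edges
// leave and enter nodes smoothly.
type Pt = { x: number; y: number };

function buildCurvePath(start: Pt, end: Pt): string {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  let cp1: Pt;
  let cp2: Pt;

  if (Math.abs(dy) > Math.abs(dx) * 0.5) {
    // Mostly vertical connection: push control points along Y
    const offset = Math.min(Math.abs(dy) * 0.5, 80);
    const direction = dy > 0 ? 1 : -1;
    cp1 = { x: start.x, y: start.y + offset * direction };
    cp2 = { x: end.x, y: end.y - offset * direction };
  } else {
    // Mostly horizontal connection: push control points along X
    const offset = Math.min(Math.abs(dx) * 0.5, 80);
    const direction = dx > 0 ? 1 : -1;
    cp1 = { x: start.x + offset * direction, y: start.y };
    cp2 = { x: end.x - offset * direction, y: end.y };
  }

  return `M ${start.x} ${start.y} C ${cp1.x} ${cp1.y}, ${cp2.x} ${cp2.y}, ${end.x} ${end.y}`;
}

console.log(buildCurvePath({ x: 0, y: 0 }, { x: 0, y: 200 }));
// → "M 0 0 C 0 80, 0 120, 0 200"
```

For a node 200px straight down, the offset caps at 80, giving symmetric control points at y=80 and y=120.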
@@ -96,13 +96,11 @@ export function useInstallPack() {
     mutationFn: async ({
       source,
       refSpec,
-      force = false,
       skipTests = false,
       skipDeps = false,
     }: {
       source: string;
       refSpec?: string;
-      force?: boolean;
       skipTests?: boolean;
       skipDeps?: boolean;
     }) => {
@@ -110,7 +108,6 @@ export function useInstallPack() {
       requestBody: {
         source,
         ref_spec: refSpec,
-        force,
         skip_tests: skipTests,
         skip_deps: skipDeps,
       },
163
web/src/hooks/useWorkflows.ts
Normal file
@@ -0,0 +1,163 @@
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { WorkflowsService } from "@/api";
import type { CreateWorkflowRequest, UpdateWorkflowRequest } from "@/api";
import type { SaveWorkflowFileRequest } from "@/types/workflow";
import { OpenAPI } from "@/api/core/OpenAPI";
import { request as __request } from "@/api/core/request";

interface WorkflowsQueryParams {
  page?: number;
  pageSize?: number;
  packRef?: string;
  tags?: string;
  enabled?: boolean;
  search?: string;
}

// Fetch all workflows with pagination and filtering
export function useWorkflows(params?: WorkflowsQueryParams) {
  return useQuery({
    queryKey: ["workflows", params],
    queryFn: async () => {
      const response = await WorkflowsService.listWorkflows({
        page: params?.page || 1,
        pageSize: params?.pageSize || 50,
        tags: params?.tags,
        enabled: params?.enabled,
        search: params?.search,
        packRef: params?.packRef,
      });
      return response;
    },
    staleTime: 30000,
  });
}

// Fetch single workflow by ref
export function useWorkflow(ref: string) {
  return useQuery({
    queryKey: ["workflows", ref],
    queryFn: async () => {
      const response = await WorkflowsService.getWorkflow({ ref });
      return response;
    },
    enabled: !!ref,
    staleTime: 30000,
  });
}

// Create a new workflow
export function useCreateWorkflow() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: async (data: CreateWorkflowRequest) => {
      const response = await WorkflowsService.createWorkflow({
        requestBody: data,
      });
      return response;
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["workflows"] });
    },
  });
}

// Update existing workflow
export function useUpdateWorkflow() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: async ({
      ref,
      data,
    }: {
      ref: string;
      data: UpdateWorkflowRequest;
    }) => {
      const response = await WorkflowsService.updateWorkflow({
        ref,
        requestBody: data,
      });
      return response;
    },
    onSuccess: (_, variables) => {
      queryClient.invalidateQueries({ queryKey: ["workflows"] });
      queryClient.invalidateQueries({
        queryKey: ["workflows", variables.ref],
      });
    },
  });
}

// Delete workflow
export function useDeleteWorkflow() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: async (ref: string) => {
      await WorkflowsService.deleteWorkflow({ ref });
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["workflows"] });
    },
  });
}

// Save workflow file to disk and sync to DB
// This calls a custom endpoint not in the generated client
export function useSaveWorkflowFile() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: async (data: SaveWorkflowFileRequest) => {
      const response = await __request(OpenAPI, {
        method: "POST",
        url: "/api/v1/packs/{pack_ref}/workflow-files",
        path: {
          pack_ref: data.pack_ref,
        },
        body: data,
        mediaType: "application/json",
      });
      return response;
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["workflows"] });
      queryClient.invalidateQueries({ queryKey: ["actions"] });
    },
  });
}

// Update an existing workflow file on disk and sync to DB
export function useUpdateWorkflowFile() {
  const queryClient = useQueryClient();

  return useMutation({
    mutationFn: async ({
      workflowRef,
      data,
    }: {
      workflowRef: string;
      data: SaveWorkflowFileRequest;
    }) => {
      const response = await __request(OpenAPI, {
        method: "PUT",
        url: "/api/v1/workflows/{ref}/file",
        path: {
          ref: workflowRef,
        },
        body: data,
        mediaType: "application/json",
      });
      return response;
    },
    onSuccess: (_, variables) => {
      queryClient.invalidateQueries({ queryKey: ["workflows"] });
      queryClient.invalidateQueries({
        queryKey: ["workflows", variables.workflowRef],
      });
      queryClient.invalidateQueries({ queryKey: ["actions"] });
    },
  });
}
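The two file-sync hooks above call endpoints that are not in the generated client, so they pass the raw URL template plus a `path` map to `__request`. A hypothetical helper (`fillPath` is not part of this codebase) illustrating the kind of placeholder substitution the client performs internally:

```typescript
// Hypothetical sketch: substitute {name} placeholders in an OpenAPI-style
// URL template with URI-encoded values, leaving unknown placeholders intact.
function fillPath(template: string, params: Record<string, string>): string {
  return template.replace(/\{(\w+)\}/g, (match: string, key: string) =>
    key in params ? encodeURIComponent(params[key]) : match,
  );
}

console.log(
  fillPath("/api/v1/packs/{pack_ref}/workflow-files", {
    pack_ref: "python_example",
  }),
);
// → "/api/v1/packs/python_example/workflow-files"
```

This mirrors how `path: { pack_ref: data.pack_ref }` in `useSaveWorkflowFile` ends up in the request URL.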
@@ -2,6 +2,21 @@
 @tailwind components;
 @tailwind utilities;
+
+@keyframes flash-highlight {
+  0% {
+    background-color: rgb(191 219 254); /* blue-200 */
+    box-shadow: 0 0 12px 2px rgb(147 197 253 / 0.5); /* blue-300 glow */
+  }
+  40% {
+    background-color: rgb(219 234 254); /* blue-100 */
+    box-shadow: 0 0 6px 1px rgb(147 197 253 / 0.3);
+  }
+  100% {
+    background-color: transparent;
+    box-shadow: none;
+  }
+}
+
 @layer components {
   /* Input field with non-editable prefix */
   .input-with-prefix {
@@ -1,13 +1,15 @@
-import { Link, useParams } from "react-router-dom";
+import { Link, useParams, useNavigate } from "react-router-dom";
 import { useActions, useAction, useDeleteAction } from "@/hooks/useActions";
 import { useExecutions } from "@/hooks/useExecutions";
 import { useState, useMemo } from "react";
-import { ChevronDown, ChevronRight, Search, X, Play } from "lucide-react";
+import { ChevronDown, ChevronRight, Search, X, Play, Plus } from "lucide-react";
 import ExecuteActionModal from "@/components/common/ExecuteActionModal";
 import ErrorDisplay from "@/components/common/ErrorDisplay";
+import { extractProperties } from "@/components/common/ParamSchemaForm";

 export default function ActionsPage() {
   const { ref } = useParams<{ ref?: string }>();
+  const navigate = useNavigate();
   const { data, isLoading, error } = useActions();
   const actions = data?.data || [];
   const [collapsedPacks, setCollapsedPacks] = useState<Set<string>>(new Set());
@@ -78,10 +80,22 @@ export default function ActionsPage() {
       {/* Left sidebar - Actions List */}
       <div className="w-96 border-r border-gray-200 overflow-y-auto bg-gray-50">
         <div className="p-4 border-b border-gray-200 bg-white sticky top-0 z-10">
+          <div className="flex items-center justify-between">
+            <div>
               <h1 className="text-2xl font-bold">Actions</h1>
               <p className="text-sm text-gray-600 mt-1">
                 {filteredActions.length} of {actions.length} actions
               </p>
+            </div>
+            <button
+              onClick={() => navigate("/actions/workflows/new")}
+              className="flex items-center gap-1.5 px-3 py-2 bg-blue-600 text-white text-sm font-medium rounded-lg hover:bg-blue-700 transition-colors shadow-sm"
+              title="Create a new workflow action"
+            >
+              <Plus className="w-4 h-4" />
+              Workflow
+            </button>
+          </div>
+
           {/* Search Bar */}
           <div className="mt-3 relative">
@@ -261,8 +275,7 @@ function ActionDetail({ actionRef }: { actionRef: string }) {

   const executions = executionsData?.data || [];
   const paramSchema = action.data?.param_schema || {};
-  const properties = paramSchema.properties || {};
+  const properties = extractProperties(paramSchema);
-  const requiredFields = paramSchema.required || [];
   const paramEntries = Object.entries(properties);

   return (
@@ -420,11 +433,16 @@ function ActionDetail({ actionRef }: { actionRef: string }) {
                 <span className="font-mono font-semibold text-sm">
                   {key}
                 </span>
-                {requiredFields.includes(key) && (
+                {param?.required && (
                   <span className="text-xs px-2 py-0.5 bg-red-100 text-red-700 rounded">
                     Required
                   </span>
                 )}
+                {param?.secret && (
+                  <span className="text-xs px-2 py-0.5 bg-yellow-100 text-yellow-700 rounded">
+                    Secret
+                  </span>
+                )}
                 <span className="text-xs px-2 py-0.5 bg-gray-100 text-gray-700 rounded">
                   {param?.type || "any"}
                 </span>
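The detail view now reads parameter metadata through `extractProperties` instead of touching `paramSchema.properties` and a separate `required` array. A hypothetical sketch of what such a helper could normalize (the real implementation lives in `ParamSchemaForm` and may differ):

```typescript
// Hypothetical normalizer: fold a JSON-Schema-style param_schema into
// per-parameter entries that carry their own `required` flag, so the UI
// can render `param?.required` and `param?.secret` directly.
interface ParamInfo {
  type?: string;
  required?: boolean;
  secret?: boolean;
  [key: string]: unknown;
}

function extractProperties(
  schema: Record<string, any> | null | undefined,
): Record<string, ParamInfo> {
  if (!schema || typeof schema !== "object") return {};
  const props = (schema.properties || {}) as Record<string, ParamInfo>;
  const required = new Set<string>((schema.required || []) as string[]);
  const out: Record<string, ParamInfo> = {};
  for (const [key, value] of Object.entries(props)) {
    // Prefer an explicit per-field flag, fall back to the required array
    out[key] = { ...value, required: value.required ?? required.has(key) };
  }
  return out;
}
```

With this shape, the `requiredFields.includes(key)` lookup from the old code collapses into a single property read per parameter.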
672
web/src/pages/actions/WorkflowBuilderPage.tsx
Normal file
@@ -0,0 +1,672 @@
import { useState, useCallback, useMemo, useRef } from "react";
import { useNavigate, useParams } from "react-router-dom";
import {
  ArrowLeft,
  Save,
  AlertTriangle,
  FileCode,
  Code,
  LayoutDashboard,
} from "lucide-react";
import yaml from "js-yaml";
import type { WorkflowYamlDefinition } from "@/types/workflow";
import ActionPalette from "@/components/workflows/ActionPalette";
import WorkflowCanvas from "@/components/workflows/WorkflowCanvas";
import type { EdgeHoverInfo } from "@/components/workflows/WorkflowEdges";
import TaskInspector from "@/components/workflows/TaskInspector";
import { useActions } from "@/hooks/useActions";
import { usePacks } from "@/hooks/usePacks";
import { useWorkflow } from "@/hooks/useWorkflows";
import {
  useSaveWorkflowFile,
  useUpdateWorkflowFile,
} from "@/hooks/useWorkflows";
import type {
  WorkflowTask,
  WorkflowBuilderState,
  PaletteAction,
  TransitionPreset,
} from "@/types/workflow";
import {
  generateUniqueTaskName,
  generateTaskId,
  builderStateToDefinition,
  definitionToBuilderState,
  validateWorkflow,
  addTransitionTarget,
  removeTaskFromTransitions,
} from "@/types/workflow";

const INITIAL_STATE: WorkflowBuilderState = {
  name: "",
  label: "",
  description: "",
  version: "1.0.0",
  packRef: "",
  parameters: {},
  output: {},
  vars: {},
  tasks: [],
  tags: [],
  enabled: true,
};

export default function WorkflowBuilderPage() {
  const navigate = useNavigate();
  const { ref: editRef } = useParams<{ ref?: string }>();
  const isEditing = !!editRef;

  // Data fetching
  const { data: actionsData, isLoading: actionsLoading } = useActions({
    pageSize: 200,
  });
  const { data: packsData } = usePacks({ pageSize: 100 });
  const { data: existingWorkflow, isLoading: workflowLoading } = useWorkflow(
    editRef || "",
  );

  // Mutations
  const saveWorkflowFile = useSaveWorkflowFile();
  const updateWorkflowFile = useUpdateWorkflowFile();

  // Builder state
  const [state, setState] = useState<WorkflowBuilderState>(INITIAL_STATE);
  const [selectedTaskId, setSelectedTaskId] = useState<string | null>(null);
  const [validationErrors, setValidationErrors] = useState<string[]>([]);
  const [showErrors, setShowErrors] = useState(false);
  const [saveError, setSaveError] = useState<string | null>(null);
  const [saveSuccess, setSaveSuccess] = useState(false);
  const [initialized, setInitialized] = useState(false);
  const [showYamlPreview, setShowYamlPreview] = useState(false);
  const [highlightedTransition, setHighlightedTransition] = useState<{
    taskId: string;
    transitionIndex: number;
  } | null>(null);
  const highlightTimeoutRef = useRef<ReturnType<typeof setTimeout> | null>(
    null,
  );

  const handleEdgeHover = useCallback(
    (info: EdgeHoverInfo | null) => {
      // Clear any pending auto-clear timeout
      if (highlightTimeoutRef.current) {
        clearTimeout(highlightTimeoutRef.current);
        highlightTimeoutRef.current = null;
      }

      if (info) {
        // Select the source task so TaskInspector opens for it
        setSelectedTaskId(info.taskId);
        setHighlightedTransition(info);

        // Auto-clear highlight after 2 seconds so the flash animation plays once
        highlightTimeoutRef.current = setTimeout(() => {
          setHighlightedTransition(null);
          highlightTimeoutRef.current = null;
        }, 2000);
      } else {
        setHighlightedTransition(null);
      }
    },
    [setSelectedTaskId],
  );

  // Initialize state from existing workflow (edit mode)
  if (isEditing && existingWorkflow && !initialized && !workflowLoading) {
    const workflow = existingWorkflow.data;
    if (workflow) {
      // Extract name from ref (e.g., "pack.name" -> "name")
      const refParts = workflow.ref.split(".");
      const name =
        refParts.length > 1 ? refParts.slice(1).join(".") : workflow.ref;

      const builderState = definitionToBuilderState(
        {
          ref: workflow.ref,
          label: workflow.label,
          description: workflow.description || undefined,
          version: workflow.version,
          parameters: workflow.param_schema || undefined,
          output: workflow.out_schema || undefined,
          tasks:
            ((workflow.definition as Record<string, unknown>)
              ?.tasks as WorkflowYamlDefinition["tasks"]) || [],
          tags: workflow.tags,
        },
        workflow.pack_ref,
        name,
      );
      setState(builderState);
      setInitialized(true);
    }
  }

  // Derived data
  const paletteActions: PaletteAction[] = useMemo(() => {
    const actions = (actionsData?.data || []) as Array<{
      id: number;
      ref: string;
      label: string;
      description?: string;
      pack_ref: string;
      param_schema?: Record<string, unknown> | null;
      out_schema?: Record<string, unknown> | null;
    }>;
    return actions.map((a) => ({
      id: a.id,
      ref: a.ref,
      label: a.label,
      description: a.description || "",
      pack_ref: a.pack_ref,
      param_schema: a.param_schema || null,
      out_schema: a.out_schema || null,
|
}));
|
||||||
|
}, [actionsData]);
|
||||||
|
|
||||||
|
// Build action schema map for stripping defaults during serialization
|
||||||
|
const actionSchemaMap = useMemo(() => {
|
||||||
|
const map = new Map<string, Record<string, unknown> | null>();
|
||||||
|
for (const action of paletteActions) {
|
||||||
|
map.set(action.ref, action.param_schema);
|
||||||
|
}
|
||||||
|
return map;
|
||||||
|
}, [paletteActions]);
|
||||||
|
|
||||||
|
const packs = useMemo(() => {
|
||||||
|
return (packsData?.data || []) as Array<{
|
||||||
|
id: number;
|
||||||
|
ref: string;
|
||||||
|
label: string;
|
||||||
|
}>;
|
||||||
|
}, [packsData]);
|
||||||
|
|
||||||
|
const selectedTask = useMemo(
|
||||||
|
() => state.tasks.find((t) => t.id === selectedTaskId) || null,
|
||||||
|
[state.tasks, selectedTaskId],
|
||||||
|
);
|
||||||
|
|
||||||
|
const allTaskNames = useMemo(
|
||||||
|
() => state.tasks.map((t) => t.name),
|
||||||
|
[state.tasks],
|
||||||
|
);
|
||||||
|
|
||||||
|
// State updaters
|
||||||
|
const updateMetadata = useCallback(
|
||||||
|
(updates: Partial<WorkflowBuilderState>) => {
|
||||||
|
setState((prev) => ({ ...prev, ...updates }));
|
||||||
|
setSaveSuccess(false);
|
||||||
|
setSaveError(null);
|
||||||
|
},
|
||||||
|
[],
|
||||||
|
);
|
||||||
|
|
||||||
|
const handleAddTaskFromPalette = useCallback(
|
||||||
|
(action: PaletteAction) => {
|
||||||
|
// Generate a task name from the action ref
|
||||||
|
const baseName = action.ref.split(".").pop() || "task";
|
||||||
|
const name = generateUniqueTaskName(state.tasks, baseName);
|
||||||
|
|
||||||
|
// Position below existing tasks
|
||||||
|
let maxY = 0;
|
||||||
|
for (const task of state.tasks) {
|
||||||
|
if (task.position.y > maxY) {
|
||||||
|
maxY = task.position.y;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Pre-populate input from action's param_schema
|
||||||
|
const input: Record<string, unknown> = {};
|
||||||
|
if (action.param_schema && typeof action.param_schema === "object") {
|
||||||
|
for (const [key, param] of Object.entries(action.param_schema)) {
|
||||||
|
const meta = param as { default?: unknown };
|
||||||
|
input[key] = meta?.default !== undefined ? meta.default : "";
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const newTask: WorkflowTask = {
|
||||||
|
id: generateTaskId(),
|
||||||
|
name,
|
||||||
|
action: action.ref,
|
||||||
|
input,
|
||||||
|
position: {
|
||||||
|
x: 300,
|
||||||
|
y: state.tasks.length === 0 ? 60 : maxY + 160,
|
||||||
|
},
|
||||||
|
};
|
||||||
|
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
tasks: [...prev.tasks, newTask],
|
||||||
|
}));
|
||||||
|
setSelectedTaskId(newTask.id);
|
||||||
|
setSaveSuccess(false);
|
||||||
|
},
|
||||||
|
[state.tasks],
|
||||||
|
);
|
||||||
|
|
||||||
|
const handleAddTask = useCallback((task: WorkflowTask) => {
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
tasks: [...prev.tasks, task],
|
||||||
|
}));
|
||||||
|
setSaveSuccess(false);
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
const handleUpdateTask = useCallback(
|
||||||
|
(taskId: string, updates: Partial<WorkflowTask>) => {
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
tasks: prev.tasks.map((t) =>
|
||||||
|
t.id === taskId ? { ...t, ...updates } : t,
|
||||||
|
),
|
||||||
|
}));
|
||||||
|
setSaveSuccess(false);
|
||||||
|
},
|
||||||
|
[],
|
||||||
|
);
|
||||||
|
|
||||||
|
const handleDeleteTask = useCallback(
|
||||||
|
(taskId: string) => {
|
||||||
|
const taskToDelete = state.tasks.find((t) => t.id === taskId);
|
||||||
|
if (!taskToDelete) return;
|
||||||
|
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
tasks: prev.tasks
|
||||||
|
.filter((t) => t.id !== taskId)
|
||||||
|
.map((t) => {
|
||||||
|
// Clean up any transitions that reference the deleted task
|
||||||
|
const cleanedNext = removeTaskFromTransitions(
|
||||||
|
t.next,
|
||||||
|
taskToDelete.name,
|
||||||
|
);
|
||||||
|
if (cleanedNext !== t.next) {
|
||||||
|
return { ...t, next: cleanedNext };
|
||||||
|
}
|
||||||
|
return t;
|
||||||
|
}),
|
||||||
|
}));
|
||||||
|
|
||||||
|
if (selectedTaskId === taskId) {
|
||||||
|
setSelectedTaskId(null);
|
||||||
|
}
|
||||||
|
setSaveSuccess(false);
|
||||||
|
},
|
||||||
|
[state.tasks, selectedTaskId],
|
||||||
|
);
|
||||||
|
|
||||||
|
const handleSetConnection = useCallback(
|
||||||
|
(fromTaskId: string, preset: TransitionPreset, toTaskName: string) => {
|
||||||
|
setState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
tasks: prev.tasks.map((t) => {
|
||||||
|
if (t.id !== fromTaskId) return t;
|
||||||
|
const next = addTransitionTarget(t, preset, toTaskName);
|
||||||
|
return { ...t, next };
|
||||||
|
}),
|
||||||
|
}));
|
||||||
|
setSaveSuccess(false);
|
||||||
|
},
|
||||||
|
[],
|
||||||
|
);
|
||||||
|
|
||||||
|
const handleSave = useCallback(async () => {
|
||||||
|
// Validate
|
||||||
|
const errors = validateWorkflow(state);
|
||||||
|
setValidationErrors(errors);
|
||||||
|
|
||||||
|
if (errors.length > 0) {
|
||||||
|
setShowErrors(true);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
const definition = builderStateToDefinition(state, actionSchemaMap);
|
||||||
|
|
||||||
|
try {
|
||||||
|
setSaveError(null);
|
||||||
|
|
||||||
|
if (isEditing && editRef) {
|
||||||
|
await updateWorkflowFile.mutateAsync({
|
||||||
|
workflowRef: editRef,
|
||||||
|
data: {
|
||||||
|
name: state.name,
|
||||||
|
label: state.label,
|
||||||
|
description: state.description || undefined,
|
||||||
|
version: state.version,
|
||||||
|
pack_ref: state.packRef,
|
||||||
|
definition,
|
||||||
|
param_schema:
|
||||||
|
Object.keys(state.parameters).length > 0
|
||||||
|
? state.parameters
|
||||||
|
: undefined,
|
||||||
|
out_schema:
|
||||||
|
Object.keys(state.output).length > 0 ? state.output : undefined,
|
||||||
|
tags: state.tags.length > 0 ? state.tags : undefined,
|
||||||
|
enabled: state.enabled,
|
||||||
|
},
|
||||||
|
});
|
||||||
|
} else {
|
||||||
|
await saveWorkflowFile.mutateAsync({
|
||||||
|
name: state.name,
|
||||||
|
label: state.label,
|
||||||
|
description: state.description || undefined,
|
||||||
|
version: state.version,
|
||||||
|
pack_ref: state.packRef,
|
||||||
|
definition,
|
||||||
|
param_schema:
|
||||||
|
Object.keys(state.parameters).length > 0
|
||||||
|
? state.parameters
|
||||||
|
: undefined,
|
||||||
|
out_schema:
|
||||||
|
Object.keys(state.output).length > 0 ? state.output : undefined,
|
||||||
|
tags: state.tags.length > 0 ? state.tags : undefined,
|
||||||
|
enabled: state.enabled,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
setSaveSuccess(true);
|
||||||
|
setTimeout(() => setSaveSuccess(false), 3000);
|
||||||
|
} catch (err: unknown) {
|
||||||
|
const error = err as { body?: { message?: string }; message?: string };
|
||||||
|
const message =
|
||||||
|
error?.body?.message || error?.message || "Failed to save workflow";
|
||||||
|
setSaveError(message);
|
||||||
|
}
|
||||||
|
}, [
|
||||||
|
state,
|
||||||
|
isEditing,
|
||||||
|
editRef,
|
||||||
|
saveWorkflowFile,
|
||||||
|
updateWorkflowFile,
|
||||||
|
actionSchemaMap,
|
||||||
|
]);
|
||||||
|
|
||||||
|
// YAML preview — generate proper YAML from builder state
|
||||||
|
const yamlPreview = useMemo(() => {
|
||||||
|
if (!showYamlPreview) return "";
|
||||||
|
try {
|
||||||
|
const definition = builderStateToDefinition(state, actionSchemaMap);
|
||||||
|
return yaml.dump(definition, {
|
||||||
|
indent: 2,
|
||||||
|
lineWidth: 120,
|
||||||
|
noRefs: true,
|
||||||
|
sortKeys: false,
|
||||||
|
quotingType: '"',
|
||||||
|
forceQuotes: false,
|
||||||
|
});
|
||||||
|
} catch {
|
||||||
|
return "# Error generating YAML preview";
|
||||||
|
}
|
||||||
|
}, [state, showYamlPreview, actionSchemaMap]);
|
||||||
|
|
||||||
|
const isSaving = saveWorkflowFile.isPending || updateWorkflowFile.isPending;
|
||||||
|
|
||||||
|
if (isEditing && workflowLoading) {
|
||||||
|
return (
|
||||||
|
<div className="flex items-center justify-center h-screen">
|
||||||
|
<div className="animate-spin rounded-full h-12 w-12 border-b-2 border-blue-600" />
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="h-[calc(100vh-4rem)] flex flex-col overflow-hidden">
|
||||||
|
{/* Top toolbar */}
|
||||||
|
<div className="flex-shrink-0 bg-white border-b border-gray-200 px-4 py-2.5">
|
||||||
|
<div className="flex items-center justify-between">
|
||||||
|
{/* Left section: Back + metadata */}
|
||||||
|
<div className="flex items-center gap-3 flex-1 min-w-0">
|
||||||
|
<button
|
||||||
|
onClick={() => navigate("/actions")}
|
||||||
|
className="p-1.5 rounded hover:bg-gray-100 text-gray-500 hover:text-gray-700 transition-colors flex-shrink-0"
|
||||||
|
title="Back to Actions"
|
||||||
|
>
|
||||||
|
<ArrowLeft className="w-5 h-5" />
|
||||||
|
</button>
|
||||||
|
|
||||||
|
<div className="flex items-center gap-2 flex-1 min-w-0">
|
||||||
|
{/* Pack selector */}
|
||||||
|
<select
|
||||||
|
value={state.packRef}
|
||||||
|
onChange={(e) => updateMetadata({ packRef: e.target.value })}
|
||||||
|
className="px-2 py-1.5 border border-gray-300 rounded text-sm focus:ring-2 focus:ring-blue-500 focus:border-blue-500 max-w-[140px]"
|
||||||
|
>
|
||||||
|
<option value="">Pack...</option>
|
||||||
|
{packs.map((pack) => (
|
||||||
|
<option key={pack.id} value={pack.ref}>
|
||||||
|
{pack.ref}
|
||||||
|
</option>
|
||||||
|
))}
|
||||||
|
</select>
|
||||||
|
|
||||||
|
<span className="text-gray-400 text-lg font-light">/</span>
|
||||||
|
|
||||||
|
{/* Workflow name */}
|
||||||
|
<input
|
||||||
|
type="text"
|
||||||
|
value={state.name}
|
||||||
|
onChange={(e) =>
|
||||||
|
updateMetadata({
|
||||||
|
name: e.target.value.replace(/[^a-zA-Z0-9_-]/g, "_"),
|
||||||
|
})
|
||||||
|
}
|
||||||
|
className="px-2 py-1.5 border border-gray-300 rounded text-sm font-mono focus:ring-2 focus:ring-blue-500 focus:border-blue-500 w-48"
|
||||||
|
placeholder="workflow_name"
|
||||||
|
/>
|
||||||
|
|
||||||
|
<span className="text-gray-400 text-lg font-light">—</span>
|
||||||
|
|
||||||
|
{/* Label */}
|
||||||
|
<input
|
||||||
|
type="text"
|
||||||
|
value={state.label}
|
||||||
|
onChange={(e) => updateMetadata({ label: e.target.value })}
|
||||||
|
className="px-2 py-1.5 border border-gray-300 rounded text-sm focus:ring-2 focus:ring-blue-500 focus:border-blue-500 flex-1 min-w-[160px] max-w-[300px]"
|
||||||
|
placeholder="Workflow Label"
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Version */}
|
||||||
|
<input
|
||||||
|
type="text"
|
||||||
|
value={state.version}
|
||||||
|
onChange={(e) => updateMetadata({ version: e.target.value })}
|
||||||
|
className="px-2 py-1.5 border border-gray-300 rounded text-sm font-mono focus:ring-2 focus:ring-blue-500 focus:border-blue-500 w-20"
|
||||||
|
placeholder="1.0.0"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Right section: Actions */}
|
||||||
|
<div className="flex items-center gap-2 flex-shrink-0 ml-4">
|
||||||
|
{/* Validation errors badge */}
|
||||||
|
{validationErrors.length > 0 && (
|
||||||
|
<button
|
||||||
|
onClick={() => setShowErrors(!showErrors)}
|
||||||
|
className="flex items-center gap-1.5 px-2.5 py-1.5 text-xs font-medium text-amber-700 bg-amber-50 border border-amber-200 rounded hover:bg-amber-100 transition-colors"
|
||||||
|
>
|
||||||
|
<AlertTriangle className="w-3.5 h-3.5" />
|
||||||
|
{validationErrors.length} issue
|
||||||
|
{validationErrors.length !== 1 ? "s" : ""}
|
||||||
|
</button>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Raw YAML / Visual mode toggle */}
|
||||||
|
<div className="flex items-center bg-gray-100 rounded-lg p-0.5">
|
||||||
|
<button
|
||||||
|
onClick={() => setShowYamlPreview(false)}
|
||||||
|
className={`flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-colors ${
|
||||||
|
!showYamlPreview
|
||||||
|
? "bg-white text-gray-900 shadow-sm"
|
||||||
|
: "text-gray-500 hover:text-gray-700"
|
||||||
|
}`}
|
||||||
|
title="Visual builder"
|
||||||
|
>
|
||||||
|
<LayoutDashboard className="w-3.5 h-3.5" />
|
||||||
|
Visual
|
||||||
|
</button>
|
||||||
|
<button
|
||||||
|
onClick={() => setShowYamlPreview(true)}
|
||||||
|
className={`flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-colors ${
|
||||||
|
showYamlPreview
|
||||||
|
? "bg-white text-gray-900 shadow-sm"
|
||||||
|
: "text-gray-500 hover:text-gray-700"
|
||||||
|
}`}
|
||||||
|
title="Raw YAML view"
|
||||||
|
>
|
||||||
|
<Code className="w-3.5 h-3.5" />
|
||||||
|
Raw YAML
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Save success indicator */}
|
||||||
|
{saveSuccess && (
|
||||||
|
<span className="text-xs text-green-600 font-medium">
|
||||||
|
✓ Saved
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Save error indicator */}
|
||||||
|
{saveError && (
|
||||||
|
<span
|
||||||
|
className="text-xs text-red-600 font-medium max-w-[200px] truncate"
|
||||||
|
title={saveError}
|
||||||
|
>
|
||||||
|
✗ {saveError}
|
||||||
|
</span>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Save button */}
|
||||||
|
<button
|
||||||
|
onClick={handleSave}
|
||||||
|
disabled={isSaving}
|
||||||
|
className="flex items-center gap-1.5 px-4 py-1.5 bg-blue-600 text-white text-sm font-medium rounded hover:bg-blue-700 disabled:opacity-50 disabled:cursor-not-allowed transition-colors shadow-sm"
|
||||||
|
>
|
||||||
|
<Save className="w-4 h-4" />
|
||||||
|
{isSaving ? "Saving..." : isEditing ? "Update" : "Save"}
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Description row (collapsible) */}
|
||||||
|
<div className="mt-2 flex items-center gap-2">
|
||||||
|
<input
|
||||||
|
type="text"
|
||||||
|
value={state.description}
|
||||||
|
onChange={(e) => updateMetadata({ description: e.target.value })}
|
||||||
|
className="flex-1 px-2 py-1 border border-gray-200 rounded text-xs text-gray-600 focus:ring-1 focus:ring-blue-500 focus:border-blue-500"
|
||||||
|
placeholder="Workflow description (optional)"
|
||||||
|
/>
|
||||||
|
<div className="flex items-center gap-1.5 flex-shrink-0">
|
||||||
|
<input
|
||||||
|
type="text"
|
||||||
|
value={state.tags.join(", ")}
|
||||||
|
onChange={(e) =>
|
||||||
|
updateMetadata({
|
||||||
|
tags: e.target.value
|
||||||
|
.split(",")
|
||||||
|
.map((t) => t.trim())
|
||||||
|
.filter(Boolean),
|
||||||
|
})
|
||||||
|
}
|
||||||
|
className="px-2 py-1 border border-gray-200 rounded text-xs text-gray-600 focus:ring-1 focus:ring-blue-500 focus:border-blue-500 w-40"
|
||||||
|
placeholder="Tags (comma-sep)"
|
||||||
|
/>
|
||||||
|
<label className="flex items-center gap-1 text-xs text-gray-600">
|
||||||
|
<input
|
||||||
|
type="checkbox"
|
||||||
|
checked={state.enabled}
|
||||||
|
onChange={(e) => updateMetadata({ enabled: e.target.checked })}
|
||||||
|
className="rounded border-gray-300 text-blue-600 focus:ring-blue-500"
|
||||||
|
/>
|
||||||
|
Enabled
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{/* Validation errors panel */}
|
||||||
|
{showErrors && validationErrors.length > 0 && (
|
||||||
|
<div className="flex-shrink-0 bg-amber-50 border-b border-amber-200 px-4 py-2">
|
||||||
|
<div className="flex items-start gap-2">
|
||||||
|
<AlertTriangle className="w-4 h-4 text-amber-600 mt-0.5 flex-shrink-0" />
|
||||||
|
<div className="flex-1">
|
||||||
|
<p className="text-xs font-medium text-amber-800 mb-1">
|
||||||
|
Please fix the following issues before saving:
|
||||||
|
</p>
|
||||||
|
<ul className="text-xs text-amber-700 space-y-0.5">
|
||||||
|
{validationErrors.map((error, index) => (
|
||||||
|
<li key={index}>• {error}</li>
|
||||||
|
))}
|
||||||
|
</ul>
|
||||||
|
</div>
|
||||||
|
<button
|
||||||
|
onClick={() => setShowErrors(false)}
|
||||||
|
className="text-amber-400 hover:text-amber-600"
|
||||||
|
>
|
||||||
|
×
|
||||||
|
</button>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Main content area */}
|
||||||
|
<div className="flex-1 flex overflow-hidden">
|
||||||
|
{showYamlPreview ? (
|
||||||
|
/* Raw YAML mode — full-width YAML view */
|
||||||
|
<div className="flex-1 flex flex-col overflow-hidden bg-gray-900">
|
||||||
|
<div className="flex items-center gap-2 px-4 py-2 bg-gray-800 border-b border-gray-700 flex-shrink-0">
|
||||||
|
<FileCode className="w-4 h-4 text-gray-400" />
|
||||||
|
<span className="text-sm font-medium text-gray-300">
|
||||||
|
Workflow Definition
|
||||||
|
</span>
|
||||||
|
<span className="text-[10px] text-gray-500 ml-1">
|
||||||
|
(read-only preview of the generated YAML)
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
<pre className="flex-1 overflow-auto p-6 text-sm font-mono text-green-400 whitespace-pre leading-relaxed">
|
||||||
|
{yamlPreview}
|
||||||
|
</pre>
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<>
|
||||||
|
{/* Left: Action Palette */}
|
||||||
|
<ActionPalette
|
||||||
|
actions={paletteActions}
|
||||||
|
isLoading={actionsLoading}
|
||||||
|
onAddTask={handleAddTaskFromPalette}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Center: Canvas */}
|
||||||
|
<WorkflowCanvas
|
||||||
|
tasks={state.tasks}
|
||||||
|
selectedTaskId={selectedTaskId}
|
||||||
|
availableActions={paletteActions}
|
||||||
|
onSelectTask={setSelectedTaskId}
|
||||||
|
onUpdateTask={handleUpdateTask}
|
||||||
|
onDeleteTask={handleDeleteTask}
|
||||||
|
onAddTask={handleAddTask}
|
||||||
|
onSetConnection={handleSetConnection}
|
||||||
|
onEdgeHover={handleEdgeHover}
|
||||||
|
/>
|
||||||
|
|
||||||
|
{/* Right: Task Inspector */}
|
||||||
|
{selectedTask && (
|
||||||
|
<TaskInspector
|
||||||
|
task={selectedTask}
|
||||||
|
allTaskNames={allTaskNames}
|
||||||
|
availableActions={paletteActions}
|
||||||
|
onUpdate={handleUpdateTask}
|
||||||
|
onClose={() => setSelectedTaskId(null)}
|
||||||
|
highlightTransitionIndex={
|
||||||
|
highlightedTransition?.taskId === selectedTask.id
|
||||||
|
? highlightedTransition.transitionIndex
|
||||||
|
: null
|
||||||
|
}
|
||||||
|
/>
|
||||||
|
)}
|
||||||
|
</>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
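The delete handler above calls a `removeTaskFromTransitions` helper that is imported from elsewhere in the repo and does not appear in this diff. A minimal sketch of the contract the call site assumes (names and shape are illustrative, not the committed implementation): drop the deleted task from every transition's `do` list, discard transitions left with no targets, and return the original array by reference when nothing changed so the `cleanedNext !== t.next` identity check can skip untouched tasks.

```typescript
// Hypothetical sketch of removeTaskFromTransitions — illustrative only.
interface TaskTransition {
  when?: string;
  do?: string[];
}

function removeTaskFromTransitions(
  next: TaskTransition[] | undefined,
  taskName: string,
): TaskTransition[] | undefined {
  if (!next) return next;
  let changed = false;
  const cleaned: TaskTransition[] = [];
  for (const t of next) {
    // Transitions that never targeted the deleted task pass through untouched.
    if (!t.do || !t.do.includes(taskName)) {
      cleaned.push(t);
      continue;
    }
    changed = true;
    const remaining = t.do.filter((n) => n !== taskName);
    // A transition with no remaining targets is dropped entirely.
    if (remaining.length > 0) {
      cleaned.push({ ...t, do: remaining });
    }
  }
  // Same reference when unchanged, so callers can use identity comparison.
  return changed ? cleaned : next;
}
```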
@@ -20,7 +20,6 @@ export default function PackInstallPage() {
   const [formData, setFormData] = useState({
     source: "",
     refSpec: "",
-    force: false,
     skipTests: false,
     skipDeps: false,
   });
@@ -42,7 +41,6 @@ export default function PackInstallPage() {
     const result = await installPack.mutateAsync({
       source: formData.source,
       refSpec: formData.refSpec || undefined,
-      force: formData.force,
       skipTests: formData.skipTests,
       skipDeps: formData.skipDeps,
     });
@@ -360,33 +358,6 @@ export default function PackInstallPage() {
               </p>
             </div>
           </div>
-
-          {/* Force Installation */}
-          <div className="flex items-start">
-            <div className="flex items-center h-5">
-              <input
-                type="checkbox"
-                id="force"
-                name="force"
-                checked={formData.force}
-                onChange={handleChange}
-                className="w-4 h-4 text-blue-600 border-gray-300 rounded focus:ring-blue-500"
-              />
-            </div>
-            <div className="ml-3">
-              <label
-                htmlFor="force"
-                className="text-sm font-medium text-gray-700"
-              >
-                Force Installation
-              </label>
-              <p className="text-sm text-gray-500">
-                Proceed with installation even if pack exists, dependencies
-                are missing, or tests fail. This will replace any existing
-                pack.
-              </p>
-            </div>
-          </div>
         </div>
       </div>
@@ -7,6 +7,7 @@ import {
   useDisableTrigger,
 } from "@/hooks/useTriggers";
 import { useState, useMemo } from "react";
+import { extractProperties } from "@/components/common/ParamSchemaForm";
 import {
   ChevronDown,
   ChevronRight,
@@ -328,13 +329,11 @@ function TriggerDetail({ triggerRef }: { triggerRef: string }) {
   }

   const paramSchema = trigger.data?.param_schema || {};
-  const properties = paramSchema.properties || {};
-  const requiredFields = paramSchema.required || [];
+  const properties = extractProperties(paramSchema);
   const paramEntries = Object.entries(properties);

   const outSchema = trigger.data?.out_schema || {};
-  const outProperties = outSchema.properties || {};
-  const outRequiredFields = outSchema.required || [];
+  const outProperties = extractProperties(outSchema);
   const outEntries = Object.entries(outProperties);

   return (
@@ -496,11 +495,16 @@ function TriggerDetail({ triggerRef }: { triggerRef: string }) {
             <span className="font-mono font-semibold text-sm">
               {key}
             </span>
-            {requiredFields.includes(key) && (
+            {param?.required && (
               <span className="text-xs px-2 py-0.5 bg-red-100 text-red-700 rounded">
                 Required
               </span>
             )}
+            {param?.secret && (
+              <span className="text-xs px-2 py-0.5 bg-yellow-100 text-yellow-700 rounded">
+                Secret
+              </span>
+            )}
             <span className="text-xs px-2 py-0.5 bg-gray-100 text-gray-700 rounded">
               {param?.type || "any"}
             </span>
@@ -543,7 +547,7 @@ function TriggerDetail({ triggerRef }: { triggerRef: string }) {
             <span className="font-mono font-semibold text-sm">
               {key}
             </span>
-            {outRequiredFields.includes(key) && (
+            {param?.required && (
               <span className="text-xs px-2 py-0.5 bg-red-100 text-red-700 rounded">
                 Required
               </span>
788 web/src/types/workflow.ts Normal file
@@ -0,0 +1,788 @@
/**
 * Workflow Builder Types
 *
 * These types represent the client-side workflow builder state
 * and map to the backend workflow YAML format.
 *
 * Uses the Orquesta-style task transition model where each task has a `next`
 * list of transitions. Each transition specifies:
 * - `when` — a condition expression (e.g., "{{ succeeded() }}", "{{ failed() }}")
 * - `publish` — variables to publish into the workflow context
 * - `do` — next tasks to invoke when the condition is met
 */

/** Position of a node on the canvas */
export interface NodePosition {
  x: number;
  y: number;
}

/**
 * A single task transition evaluated after task completion.
 *
 * Transitions are evaluated in order. When `when` is not defined,
 * the transition is unconditional (fires on any completion).
 */
export interface TaskTransition {
  /** Condition expression (e.g., "{{ succeeded() }}", "{{ failed() }}") */
  when?: string;
  /** Variables to publish into the workflow context on this transition */
  publish?: PublishDirective[];
  /** Next tasks to invoke when transition criteria is met */
  do?: string[];
  /** Custom display label for the transition (overrides auto-derived label) */
  label?: string;
  /** Custom color for the transition edge (CSS color string, e.g., "#ff6600") */
  color?: string;
}

/** A task node in the workflow builder */
export interface WorkflowTask {
  /** Unique ID for the builder (not persisted) */
  id: string;
  /** Task name (used in YAML) */
  name: string;
  /** Action reference (e.g., "core.echo") */
  action: string;
  /** Input parameters (template strings or values) */
  input: Record<string, unknown>;
  /** Task transitions — evaluated in order after task completes */
  next?: TaskTransition[];
  /** Delay in seconds before executing this task */
  delay?: number;
  /** Retry configuration */
  retry?: RetryConfig;
  /** Timeout in seconds */
  timeout?: number;
  /** With-items iteration expression */
  with_items?: string;
  /** Batch size for with-items */
  batch_size?: number;
  /** Concurrency limit for with-items */
  concurrency?: number;
  /** Join barrier count */
  join?: number;
  /** Visual position on canvas */
  position: NodePosition;
}

/** Retry configuration */
export interface RetryConfig {
  /** Number of retry attempts */
  count: number;
  /** Initial delay in seconds */
  delay: number;
  /** Backoff strategy */
  backoff?: "constant" | "linear" | "exponential";
  /** Maximum delay in seconds */
  max_delay?: number;
  /** Only retry on specific error conditions */
  on_error?: string;
}

/** Variable publishing directive */
export type PublishDirective = Record<string, string>;

/**
 * Transition handle presets for the visual builder.
 *
 * These map to common `when` expressions and provide a quick way
 * to create transitions without typing expressions manually.
 */
export type TransitionPreset = "succeeded" | "failed" | "always";

/** The `when` expression for each preset (undefined = unconditional) */
export const PRESET_WHEN: Record<TransitionPreset, string | undefined> = {
  succeeded: "{{ succeeded() }}",
  failed: "{{ failed() }}",
  always: undefined,
};

/** Human-readable labels for presets */
export const PRESET_LABELS: Record<TransitionPreset, string> = {
  succeeded: "On Success",
  failed: "On Failure",
  always: "Always",
};

/**
 * Classify a `when` expression into an edge visual type.
 * Used for edge coloring and labeling.
 */
export type EdgeType = "success" | "failure" | "complete" | "custom";

export function classifyTransitionWhen(when?: string): EdgeType {
  if (!when) return "complete"; // unconditional
  const lower = when.toLowerCase().replace(/\s+/g, "");
  if (lower.includes("succeeded()")) return "success";
  if (lower.includes("failed()")) return "failure";
  return "custom";
}

/** Human-readable short label for a `when` expression */
export function transitionLabel(when?: string, customLabel?: string): string {
  if (customLabel) return customLabel;
  if (!when) return "always";
  const lower = when.toLowerCase().replace(/\s+/g, "");
  if (lower.includes("succeeded()")) return "succeeded";
  if (lower.includes("failed()")) return "failed";
  // Truncate custom expressions for display
  if (when.length > 30) return when.slice(0, 27) + "...";
  return when;
}
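For reference, a runnable check of how the classifier and label helpers behave for the common cases; the two functions are restated verbatim from the file above so the snippet is self-contained, with a local (non-exported) `EdgeType` alias:

```typescript
// Restated from workflow.ts above; example calls follow.
type EdgeType = "success" | "failure" | "complete" | "custom";

function classifyTransitionWhen(when?: string): EdgeType {
  if (!when) return "complete"; // unconditional
  const lower = when.toLowerCase().replace(/\s+/g, "");
  if (lower.includes("succeeded()")) return "success";
  if (lower.includes("failed()")) return "failure";
  return "custom";
}

function transitionLabel(when?: string, customLabel?: string): string {
  if (customLabel) return customLabel;
  if (!when) return "always";
  const lower = when.toLowerCase().replace(/\s+/g, "");
  if (lower.includes("succeeded()")) return "succeeded";
  if (lower.includes("failed()")) return "failed";
  if (when.length > 30) return when.slice(0, 27) + "...";
  return when;
}

console.log(classifyTransitionWhen("{{ succeeded() }}")); // success
console.log(classifyTransitionWhen("{{ ctx().attempts < 3 }}")); // custom
console.log(transitionLabel(undefined)); // always
console.log(transitionLabel(undefined, "Retry gate")); // Retry gate
```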

/** An edge/connection between two tasks */
export interface WorkflowEdge {
  /** Source task ID */
  from: string;
  /** Target task ID */
  to: string;
  /** Visual type of transition (derived from `when`) */
  type: EdgeType;
  /** Label to display on the edge */
  label?: string;
  /** Index of the transition in the source task's `next` array */
  transitionIndex: number;
  /** Custom color override for the edge (CSS color string) */
  color?: string;
}

/** Complete workflow builder state */
export interface WorkflowBuilderState {
  /** Workflow name (used to derive ref and filename) */
  name: string;
  /** Human-readable label */
  label: string;
  /** Description */
  description: string;
  /** Semantic version */
  version: string;
  /** Pack reference this workflow belongs to */
  packRef: string;
  /** Input parameter schema (flat format) */
  parameters: Record<string, ParamDefinition>;
  /** Output schema (flat format) */
  output: Record<string, ParamDefinition>;
  /** Workflow-scoped variables */
  vars: Record<string, unknown>;
  /** Task nodes */
  tasks: WorkflowTask[];
  /** Tags */
  tags: string[];
  /** Whether the workflow is enabled */
  enabled: boolean;
}

/** Parameter definition in flat schema format */
export interface ParamDefinition {
  type: string;
  description?: string;
  required?: boolean;
  secret?: boolean;
  default?: unknown;
  enum?: string[];
}

/** Workflow definition as stored in the YAML file / API */
export interface WorkflowYamlDefinition {
  ref: string;
  label: string;
  description?: string;
  version: string;
  parameters?: Record<string, unknown>;
  output?: Record<string, unknown>;
  vars?: Record<string, unknown>;
  tasks: WorkflowYamlTask[];
  output_map?: Record<string, string>;
  tags?: string[];
}

/** Transition as represented in YAML format */
export interface WorkflowYamlTransition {
  when?: string;
  publish?: PublishDirective[];
  do?: string[];
  /** Custom display label for the transition */
  label?: string;
  /** Custom color for the transition edge */
  color?: string;
}

/** Task as represented in YAML format */
export interface WorkflowYamlTask {
  name: string;
  action?: string;
  input?: Record<string, unknown>;
  delay?: number;
  with_items?: string;
  batch_size?: number;
|
||||||
|
concurrency?: number;
|
||||||
|
retry?: RetryConfig;
|
||||||
|
timeout?: number;
|
||||||
|
next?: WorkflowYamlTransition[];
|
||||||
|
join?: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
/** Request to save a workflow file to disk and sync to DB */
|
||||||
|
export interface SaveWorkflowFileRequest {
|
||||||
|
/** Workflow name (becomes filename: {name}.workflow.yaml) */
|
||||||
|
name: string;
|
||||||
|
/** Human-readable label */
|
||||||
|
label: string;
|
||||||
|
/** Description */
|
||||||
|
description?: string;
|
||||||
|
/** Semantic version */
|
||||||
|
version: string;
|
||||||
|
/** Pack reference */
|
||||||
|
pack_ref: string;
|
||||||
|
/** The full workflow definition as JSON */
|
||||||
|
definition: WorkflowYamlDefinition;
|
||||||
|
/** Parameter schema (flat format) */
|
||||||
|
param_schema?: Record<string, unknown>;
|
||||||
|
/** Output schema (flat format) */
|
||||||
|
out_schema?: Record<string, unknown>;
|
||||||
|
/** Tags */
|
||||||
|
tags?: string[];
|
||||||
|
/** Whether the workflow is enabled */
|
||||||
|
enabled?: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
/** An action summary used in the action palette */
|
||||||
|
export interface PaletteAction {
|
||||||
|
id: number;
|
||||||
|
ref: string;
|
||||||
|
label: string;
|
||||||
|
description: string;
|
||||||
|
pack_ref: string;
|
||||||
|
param_schema: Record<string, unknown> | null;
|
||||||
|
out_schema: Record<string, unknown> | null;
|
||||||
|
}
|
||||||
|
|
||||||
|
// ---------------------------------------------------------------------------
// Conversion functions
// ---------------------------------------------------------------------------

/**
 * Check if two values are deeply equal for the purpose of default comparison.
 * Handles primitives, arrays, and plain objects.
 */
function deepEqual(a: unknown, b: unknown): boolean {
  if (a === b) return true;
  if (a == null || b == null) return false;
  if (typeof a !== typeof b) return false;
  if (typeof a !== "object") return false;
  if (Array.isArray(a) !== Array.isArray(b)) return false;
  if (Array.isArray(a) && Array.isArray(b)) {
    if (a.length !== b.length) return false;
    return a.every((v, i) => deepEqual(v, b[i]));
  }
  const aObj = a as Record<string, unknown>;
  const bObj = b as Record<string, unknown>;
  const aKeys = Object.keys(aObj);
  const bKeys = Object.keys(bObj);
  if (aKeys.length !== bKeys.length) return false;
  return aKeys.every((key) => deepEqual(aObj[key], bObj[key]));
}

/**
 * Strip input values that match their schema defaults.
 * Returns a new object containing only user-modified values.
 */
export function stripDefaultInputs(
  input: Record<string, unknown>,
  paramSchema: Record<string, unknown> | null | undefined,
): Record<string, unknown> {
  if (!paramSchema || typeof paramSchema !== "object") return input;
  const result: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(input)) {
    const schemaDef = paramSchema[key] as
      | { default?: unknown }
      | null
      | undefined;
    if (
      schemaDef &&
      schemaDef.default !== undefined &&
      deepEqual(value, schemaDef.default)
    ) {
      continue; // skip — matches default
    }
    // Also skip empty strings when there's no default (user never filled it in)
    if (value === "" && (!schemaDef || schemaDef.default === undefined)) {
      continue;
    }
    result[key] = value;
  }
  return result;
}

/**
 * Convert builder state to YAML definition for saving.
 *
 * When `actionSchemas` is provided (a map of action ref → param_schema),
 * input values that match their schema defaults are omitted from the output
 * so only user-modified parameters appear in the generated YAML.
 */
export function builderStateToDefinition(
  state: WorkflowBuilderState,
  actionSchemas?: Map<string, Record<string, unknown> | null>,
): WorkflowYamlDefinition {
  const tasks: WorkflowYamlTask[] = state.tasks.map((task) => {
    const yamlTask: WorkflowYamlTask = {
      name: task.name,
    };

    if (task.action) {
      yamlTask.action = task.action;
    }

    // Filter input: strip values that match schema defaults
    const schema = actionSchemas?.get(task.action);
    const effectiveInput = schema
      ? stripDefaultInputs(task.input, schema)
      : task.input;
    if (Object.keys(effectiveInput).length > 0) {
      yamlTask.input = effectiveInput;
    }

    if (task.delay) yamlTask.delay = task.delay;
    if (task.with_items) yamlTask.with_items = task.with_items;
    if (task.batch_size) yamlTask.batch_size = task.batch_size;
    if (task.concurrency) yamlTask.concurrency = task.concurrency;
    if (task.retry) yamlTask.retry = task.retry;
    if (task.timeout) yamlTask.timeout = task.timeout;
    if (task.join) yamlTask.join = task.join;

    // Serialize transitions as `next` array
    if (task.next && task.next.length > 0) {
      yamlTask.next = task.next.map((t) => {
        const yt: WorkflowYamlTransition = {};
        if (t.when) yt.when = t.when;
        if (t.publish && t.publish.length > 0) yt.publish = t.publish;
        if (t.do && t.do.length > 0) yt.do = t.do;
        if (t.label) yt.label = t.label;
        if (t.color) yt.color = t.color;
        return yt;
      });
    }

    return yamlTask;
  });

  const definition: WorkflowYamlDefinition = {
    ref: `${state.packRef}.${state.name}`,
    label: state.label,
    version: state.version,
    tasks,
  };

  if (state.description) {
    definition.description = state.description;
  }

  if (Object.keys(state.parameters).length > 0) {
    definition.parameters = state.parameters;
  }

  if (Object.keys(state.output).length > 0) {
    definition.output = state.output;
  }

  if (Object.keys(state.vars).length > 0) {
    definition.vars = state.vars;
  }

  if (state.tags.length > 0) {
    definition.tags = state.tags;
  }

  return definition;
}
// ---------------------------------------------------------------------------
// Legacy format conversion helpers
// ---------------------------------------------------------------------------

/** Legacy task fields that may appear in older workflow definitions */
interface LegacyYamlTask extends WorkflowYamlTask {
  on_success?: string;
  on_failure?: string;
  on_complete?: string;
  on_timeout?: string;
  decision?: { when?: string; next: string; default?: boolean }[];
  publish?: PublishDirective[];
}

/**
 * Convert legacy on_success/on_failure/etc fields to `next` transitions.
 * This allows the builder to load workflows saved in the old format.
 */
function legacyTransitionsToNext(task: LegacyYamlTask): TaskTransition[] {
  const transitions: TaskTransition[] = [];

  if (task.on_success) {
    transitions.push({
      when: "{{ succeeded() }}",
      do: [task.on_success],
    });
  }

  if (task.on_failure) {
    transitions.push({
      when: "{{ failed() }}",
      do: [task.on_failure],
    });
  }

  if (task.on_complete) {
    // on_complete = unconditional (fires regardless of success/failure)
    transitions.push({
      do: [task.on_complete],
    });
  }

  if (task.on_timeout) {
    transitions.push({
      when: "{{ timed_out() }}",
      do: [task.on_timeout],
    });
  }

  // Convert legacy decision branches
  if (task.decision) {
    for (const branch of task.decision) {
      transitions.push({
        when: branch.when || undefined,
        do: [branch.next],
      });
    }
  }

  // If legacy task had publish but no transitions, create a publish-only transition
  if (task.publish && task.publish.length > 0 && transitions.length === 0) {
    transitions.push({
      when: "{{ succeeded() }}",
      publish: task.publish,
    });
  } else if (
    task.publish &&
    task.publish.length > 0 &&
    transitions.length > 0
  ) {
    // Attach publish to the first succeeded transition, or the first transition
    const succeededIdx = transitions.findIndex(
      (t) => t.when && t.when.toLowerCase().includes("succeeded()"),
    );
    const idx = succeededIdx >= 0 ? succeededIdx : 0;
    transitions[idx].publish = task.publish;
  }

  return transitions;
}

/**
 * Convert a YAML definition back to builder state (for editing existing workflows).
 * Supports both new `next` format and legacy `on_success`/`on_failure` format.
 */
export function definitionToBuilderState(
  definition: WorkflowYamlDefinition,
  packRef: string,
  name: string,
): WorkflowBuilderState {
  const tasks: WorkflowTask[] = (definition.tasks || []).map(
    (rawTask, index) => {
      const task = rawTask as LegacyYamlTask;

      // Determine transitions: prefer `next` if present, otherwise convert legacy fields
      let next: TaskTransition[] | undefined;
      if (task.next && task.next.length > 0) {
        next = task.next.map((t) => ({
          when: t.when,
          publish: t.publish,
          do: t.do,
          label: t.label,
          color: t.color,
        }));
      } else {
        const converted = legacyTransitionsToNext(task);
        next = converted.length > 0 ? converted : undefined;
      }

      return {
        id: `task-${index}-${Date.now()}`,
        name: task.name,
        action: task.action || "",
        input: task.input || {},
        next,
        delay: task.delay,
        retry: task.retry,
        timeout: task.timeout,
        with_items: task.with_items,
        batch_size: task.batch_size,
        concurrency: task.concurrency,
        join: task.join,
        position: {
          x: 300,
          y: 80 + index * 160,
        },
      };
    },
  );

  return {
    name,
    label: definition.label,
    description: definition.description || "",
    version: definition.version,
    packRef,
    parameters: (definition.parameters || {}) as Record<
      string,
      ParamDefinition
    >,
    output: (definition.output || {}) as Record<string, ParamDefinition>,
    vars: definition.vars || {},
    tasks,
    tags: definition.tags || [],
    enabled: true,
  };
}

// ---------------------------------------------------------------------------
// Edge derivation
// ---------------------------------------------------------------------------

/**
 * Derive visual edges from task transitions.
 *
 * Each entry in a task's `next` array can target multiple tasks via `do`.
 * Each target produces a separate edge with the same visual type/label.
 */
export function deriveEdges(tasks: WorkflowTask[]): WorkflowEdge[] {
  const edges: WorkflowEdge[] = [];
  const taskNameToId = new Map<string, string>();

  for (const task of tasks) {
    taskNameToId.set(task.name, task.id);
  }

  for (const task of tasks) {
    if (!task.next) continue;

    for (let ti = 0; ti < task.next.length; ti++) {
      const transition = task.next[ti];
      const edgeType = classifyTransitionWhen(transition.when);
      const label = transitionLabel(transition.when, transition.label);

      if (transition.do) {
        for (const targetName of transition.do) {
          const targetId = taskNameToId.get(targetName);
          if (targetId) {
            edges.push({
              from: task.id,
              to: targetId,
              type: edgeType,
              label,
              transitionIndex: ti,
              color: transition.color,
            });
          }
        }
      }
    }
  }

  return edges;
}
// ---------------------------------------------------------------------------
// Task transition helpers
// ---------------------------------------------------------------------------

/**
 * Find or create a transition in a task's `next` array that matches a preset.
 *
 * If a transition with a matching `when` expression already exists, returns
 * its index. Otherwise, appends a new transition and returns the new index.
 */
export function findOrCreateTransition(
  task: WorkflowTask,
  preset: TransitionPreset,
): { next: TaskTransition[]; index: number } {
  const whenExpr = PRESET_WHEN[preset];
  const next = [...(task.next || [])];

  // Look for an existing transition with the same `when`
  const existingIndex = next.findIndex((t) => {
    if (whenExpr === undefined) return t.when === undefined;
    return (
      t.when?.toLowerCase().replace(/\s+/g, "") ===
      whenExpr.toLowerCase().replace(/\s+/g, "")
    );
  });

  if (existingIndex >= 0) {
    return { next, index: existingIndex };
  }

  // Create new transition
  const newTransition: TaskTransition = {};
  if (whenExpr) newTransition.when = whenExpr;
  next.push(newTransition);
  return { next, index: next.length - 1 };
}

/**
 * Add a target task to a transition's `do` list.
 * If the target is already in the list, this is a no-op.
 * Returns the updated `next` array.
 */
export function addTransitionTarget(
  task: WorkflowTask,
  preset: TransitionPreset,
  targetTaskName: string,
): TaskTransition[] {
  const { next, index } = findOrCreateTransition(task, preset);
  const transition = { ...next[index] };
  const doList = [...(transition.do || [])];

  if (!doList.includes(targetTaskName)) {
    doList.push(targetTaskName);
  }

  transition.do = doList;
  next[index] = transition;
  return next;
}

/**
 * Remove all references to a task name from all transitions.
 * Cleans up transitions that become empty (no `do` and no `publish`).
 */
export function removeTaskFromTransitions(
  next: TaskTransition[] | undefined,
  taskName: string,
): TaskTransition[] | undefined {
  if (!next) return undefined;

  const cleaned = next
    .map((t) => {
      if (!t.do || !t.do.includes(taskName)) return t;
      const newDo = t.do.filter((name) => name !== taskName);
      return { ...t, do: newDo.length > 0 ? newDo : undefined };
    })
    // Keep transitions that still have `do` targets or `publish` directives
    .filter(
      (t) => (t.do && t.do.length > 0) || (t.publish && t.publish.length > 0),
    );

  return cleaned.length > 0 ? cleaned : undefined;
}

// ---------------------------------------------------------------------------
// Utility functions
// ---------------------------------------------------------------------------

/**
 * Generate a unique task ID
 */
export function generateTaskId(): string {
  return `task-${Date.now()}-${Math.random().toString(36).substring(2, 9)}`;
}

/**
 * Create a new empty task
 */
export function createEmptyTask(
  name: string,
  position: NodePosition,
): WorkflowTask {
  return {
    id: generateTaskId(),
    name,
    action: "",
    input: {},
    position,
  };
}

/**
 * Generate a unique task name that doesn't conflict with existing tasks
 */
export function generateUniqueTaskName(
  existingTasks: WorkflowTask[],
  baseName: string = "task",
): string {
  const existingNames = new Set(existingTasks.map((t) => t.name));
  let counter = existingTasks.length + 1;
  let name = `${baseName}_${counter}`;
  while (existingNames.has(name)) {
    counter++;
    name = `${baseName}_${counter}`;
  }
  return name;
}

/**
 * Validate a workflow builder state and return any errors
 */
export function validateWorkflow(state: WorkflowBuilderState): string[] {
  const errors: string[] = [];

  if (!state.name.trim()) {
    errors.push("Workflow name is required");
  }

  if (!state.label.trim()) {
    errors.push("Workflow label is required");
  }

  if (!state.version.trim()) {
    errors.push("Workflow version is required");
  }

  if (!state.packRef) {
    errors.push("Pack reference is required");
  }

  if (state.tasks.length === 0) {
    errors.push("Workflow must have at least one task");
  }

  // Check for duplicate task names
  const taskNames = new Set<string>();
  for (const task of state.tasks) {
    if (taskNames.has(task.name)) {
      errors.push(`Duplicate task name: "${task.name}"`);
    }
    taskNames.add(task.name);
  }

  // Check that tasks have an action reference
  for (const task of state.tasks) {
    if (!task.action) {
      errors.push(`Task "${task.name}" must have an action assigned`);
    }
  }

  // Check that all transition targets reference existing tasks
  for (const task of state.tasks) {
    if (!task.next) continue;

    for (let ti = 0; ti < task.next.length; ti++) {
      const transition = task.next[ti];
      if (!transition.do) continue;

      for (const targetName of transition.do) {
        if (!taskNames.has(targetName)) {
          const whenLabel = transition.when
            ? ` (when: ${transition.when})`
            : " (always)";
          errors.push(
            `Task "${task.name}" transition${whenLabel} references non-existent task "${targetName}"`,
          );
        }
      }
    }
  }

  return errors;
}
133
work-summary/2026-02-04-orquesta-style-transitions.md
Normal file
@@ -0,0 +1,133 @@
# Orquesta-Style Task Transition Model

**Date**: 2026-02-04

## Overview

Refactored the workflow builder's task transition model from flat `on_success`/`on_failure`/`on_complete`/`on_timeout` fields to an Orquesta-style ordered `next` array of transitions. Each transition can specify a `when` condition, `publish` directives, and multiple `do` targets — enabling far more expressive workflow definitions.

Also added visual drag handles to task nodes in the workflow builder for creating transitions via drag-and-drop.

## Motivation

The previous model only allowed a single target task per transition type and had no way to:
- Route to multiple tasks from a single transition
- Attach per-transition variable publishing
- Use custom condition expressions beyond the four fixed types
- Publish variables without transitioning to another task

The Orquesta model (from StackStorm) solves all of these with a simple, ordered list of conditional transitions.
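The shape of the new model can be sketched in TypeScript. This is a minimal sketch: `TaskTransition` is a simplified copy of the builder's type, and `countEdges` is a hypothetical helper added here only to illustrate the fan-out semantics.

```typescript
// Simplified copy of the builder's TaskTransition shape (the real type lives
// in web/src/types/workflow.ts); countEdges is illustrative, not part of the
// codebase.
interface TaskTransition {
  when?: string;                      // condition; absent = unconditional
  publish?: Record<string, string>[]; // per-transition variable publishing
  do?: string[];                      // one transition, many targets
}

// An ordered `next` list for a single task: the succeeded branch publishes a
// variable and fans out to two parallel targets; the failed branch routes to
// an error handler.
const next: TaskTransition[] = [
  {
    when: "{{ succeeded() }}",
    publish: [{ result: "{{ result() }}" }],
    do: ["task2", "log"],
  },
  { when: "{{ failed() }}", do: ["error_handler"] },
];

// Each (transition, target) pair becomes one outgoing edge in the canvas.
function countEdges(transitions: TaskTransition[]): number {
  return transitions.reduce((n, t) => n + (t.do?.length ?? 0), 0);
}

console.log(countEdges(next)); // 3
```

Ordering matters: the first transition whose `when` evaluates true for a given outcome fires, which is what makes custom expressions and the fixed presets composable in one list.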
## Changes

### Frontend (`web/`)

#### `web/src/types/workflow.ts`
- **Added** `TaskTransition` type: `{ when?: string; publish?: PublishDirective[]; do?: string[] }`
- **Added** `TransitionPreset` type and constants (`PRESET_WHEN`, `PRESET_LABELS`) for the three common quick-access patterns: succeeded, failed, always
- **Added** `classifyTransitionWhen()` and `transitionLabel()` for edge visualization
- **Added** `EdgeType` — simplified to `"success" | "failure" | "complete" | "custom"`
- **Added** helper functions: `findOrCreateTransition()`, `addTransitionTarget()`, `removeTaskFromTransitions()`
- **Removed** `on_success`, `on_failure`, `on_complete`, `on_timeout`, `decision`, `publish` fields from `WorkflowTask`
- **Removed** `DecisionBranch` type (subsumed by `TaskTransition.when`)
- **Updated** `WorkflowYamlTask` to use `next?: WorkflowYamlTransition[]`
- **Updated** `builderStateToDefinition()` to serialize the `next` array
- **Updated** `definitionToBuilderState()` to load both the new `next` format and legacy flat fields (auto-converts)
- **Updated** `deriveEdges()` to iterate `task.next[].do[]`
- **Updated** `validateWorkflow()` to validate `next[].do[]` targets
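The legacy auto-conversion can be shown end to end with a condensed version of `legacyTransitionsToNext()`. Only the `on_success`/`on_failure` branches are reproduced here; the full helper in `web/src/types/workflow.ts` also handles `on_complete`, `on_timeout`, `decision`, and `publish`.

```typescript
// Condensed from legacyTransitionsToNext(): only the two most common legacy
// fields are mapped onto `next` transitions here.
interface TaskTransition {
  when?: string;
  do?: string[];
}

interface LegacyTask {
  name: string;
  on_success?: string;
  on_failure?: string;
}

function legacyToNext(task: LegacyTask): TaskTransition[] {
  const transitions: TaskTransition[] = [];
  if (task.on_success) {
    transitions.push({ when: "{{ succeeded() }}", do: [task.on_success] });
  }
  if (task.on_failure) {
    transitions.push({ when: "{{ failed() }}", do: [task.on_failure] });
  }
  return transitions;
}

// A task in the legacy flat format loads as two conditional transitions:
const next = legacyToNext({
  name: "task1",
  on_success: "task2",
  on_failure: "error_handler",
});
console.log(JSON.stringify(next));
```

Because the conversion happens at load time in `definitionToBuilderState()`, the builder only ever edits the `next` representation, and re-saving a legacy workflow rewrites it in the canonical format.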
#### `web/src/components/workflows/TaskNode.tsx`
- **Redesigned output handles**: Three color-coded drag handles at bottom (green=succeeded, red=failed, gray=always)
- **Added input handle**: Neutral circle at top center as drop target, highlights purple during active connection
- **Removed** old footer link-icon buttons
- **Added** transition summary in node body (e.g., "2 targets via 1 transition")
- **Added** custom transitions badge

#### `web/src/components/workflows/WorkflowEdges.tsx`
- Updated edge colors for new `EdgeType` values
- Preview line color now uses `TransitionPreset` mapping
- Dynamic label width based on text content

#### `web/src/components/workflows/WorkflowCanvas.tsx`
- Updated to use `TransitionPreset` instead of `TransitionType`
- Added `onMouseUp` handler for drag-to-cancel on canvas background

#### `web/src/components/workflows/TaskInspector.tsx`
- **Replaced** four fixed `TransitionField` dropdowns with a dynamic transition list editor
- Each transition card shows: `when` expression (editable), `do` target list (add/remove), `publish` key-value pairs (add/remove)
- Quick-set buttons for common `when` presets (On Success, On Failure, Always)
- Add transition buttons: "On Success", "On Failure", "Custom transition"
- **Moved** publish variables from task-level section to per-transition
- **Removed** old `TransitionField` component
- **Added** Join section for barrier configuration

#### `web/src/pages/actions/WorkflowBuilderPage.tsx`
- Updated `handleSetConnection` to use `addTransitionTarget()` with `TransitionPreset`
- Updated `handleDeleteTask` to use `removeTaskFromTransitions()`

### Backend (`crates/`)

#### `crates/common/src/workflow/parser.rs`
- **Added** `TaskTransition` struct: `{ when, publish, do }`
- **Added** `Task::normalize_transitions()` — converts legacy fields into `next` array
- **Added** `Task::all_transition_targets()` — collects all referenced task names
- **Updated** `parse_workflow_yaml()` to call `normalize_all_transitions()` after parsing
- **Updated** `validate_task()` to use `all_transition_targets()` instead of checking individual fields
- Legacy fields (`on_success`, `on_failure`, `on_complete`, `on_timeout`, `decision`) retained for deserialization but cleared after normalization
- **Added** 12 new tests covering both new and legacy formats

#### `crates/common/src/workflow/validator.rs`
- Updated `build_graph()` and `find_entry_points()` to use `task.all_transition_targets()`

#### `crates/common/src/workflow/mod.rs`
- Exported new `TaskTransition` type

#### `crates/executor/src/workflow/graph.rs`
- **Replaced** `TaskTransitions` struct (flat fields) with `Vec<GraphTransition>`
- **Added** `GraphTransition`: `{ when, publish: Vec<PublishVar>, do_tasks: Vec<String> }`
- **Added** `PublishVar`: `{ name, expression }` — preserves both key and value
- **Added** `TransitionKind` enum and `GraphTransition::kind()` classifier
- **Added** `TaskGraph::matching_transitions()` — returns full transition objects for coordinators
- **Added** `TaskGraph::all_transition_targets()` — all target names from a task
- **Updated** `next_tasks()` to evaluate transitions by `TransitionKind`
- **Updated** `compute_inbound_edges()` to iterate `GraphTransition.do_tasks`
- **Updated** `extract_publish_vars()` to return `Vec<PublishVar>` instead of `Vec<String>`
- **Added** 12 new tests

#### `crates/executor/src/workflow/task_executor.rs`
- Updated variable publishing to extract from matching transitions instead of removed `task.publish` field

## YAML Format

### New (canonical)
```yaml
tasks:
  - name: task1
    action: core.echo
    next:
      - when: "{{ succeeded() }}"
        publish:
          - result: "{{ result() }}"
        do:
          - task2
          - log
      - when: "{{ failed() }}"
        do:
          - error_handler
```

### Legacy (still parsed, auto-converted)
```yaml
tasks:
  - name: task1
    action: core.echo
    on_success: task2
    on_failure: error_handler
```

## Test Results

- **Parser tests**: 37 passed (includes 12 new)
- **Graph tests**: 12 passed (includes 10 new)
- **TypeScript**: Zero errors
- **Rust workspace**: Zero warnings
||||||
@@ -0,0 +1,57 @@
|
|||||||
|
# Pack Reinstallation: Preserve Ad-Hoc Rules

**Date**: 2026-02-05

## Problem

When reinstalling a pack (`force=true`), user-created (ad-hoc) rules belonging to that pack were being permanently deleted. This happened because the reinstallation flow performed a hard `PackRepository::delete()` before recreating the pack, and the `rule.pack` foreign key uses `ON DELETE CASCADE` — destroying all rules owned by the pack, including custom ones created by users through the API or UI.

Additionally, rules from *other* packs that referenced triggers or actions from the reinstalled pack would have their `action` and `trigger` FK columns set to `NULL` (via `ON DELETE SET NULL`) when the old pack's entities were cascade-deleted, but were never re-linked after the new entities were created.

## Root Cause

In `register_pack_internal()` (`crates/api/src/routes/packs.rs`), the force-reinstall path was:

```
1. Delete existing pack (CASCADE deletes ALL rules, actions, triggers, sensors, runtimes)
2. Create new pack + components
```

No distinction was made between pack-defined rules (`is_adhoc = false`) and user-created rules (`is_adhoc = true`).
## Solution

### Repository Changes (`crates/common/src/repositories/rule.rs`)

Added four new methods/types:

- **`RestoreRuleInput`** — Like `CreateRuleInput` but with `Option<Id>` for action and trigger, since referenced entities may not exist after reinstallation.
- **`find_adhoc_by_pack()`** — Queries ad-hoc rules (`is_adhoc = true`) belonging to a specific pack.
- **`restore_rule()`** — Inserts a rule with optional action/trigger FK IDs, always setting `is_adhoc = true`.
- **`relink_action_by_ref()` / `relink_trigger_by_ref()`** — Updates rules with NULL action/trigger FKs, matching by the text `_ref` field to re-establish the link.

### Pack Registration Changes (`crates/api/src/routes/packs.rs`)

Modified `register_pack_internal()` to add two phases to the force-reinstall path:

**Phase 1 — Save & Restore Ad-Hoc Rules:**

- Before deleting the old pack, queries and saves all ad-hoc rules
- After the new pack and components are created, restores each saved rule with the new pack ID
- Resolves action/trigger FKs by looking up entities by ref; if not found, the rule is preserved with NULL FKs (non-functional but not lost)

**Phase 2 — Re-link Orphaned Rules from Other Packs:**

- Iterates over all newly created actions and triggers
- For each, updates any rules (from any pack) that have a matching `_ref` but a NULL FK
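The restore step boils down to a by-ref lookup with a NULL fallback. A minimal TypeScript sketch of that resolution logic (the real implementation is Rust; field names and the entity maps here are illustrative):

```typescript
interface SavedRule {
  name: string;
  actionRef: string | null;  // text `_ref`, survives the cascade delete
  triggerRef: string | null;
}

interface RestoredRule {
  name: string;
  actionId: string | null;   // FK into the newly created entities, or NULL
  triggerId: string | null;
  isAdhoc: true;
}

// Resolve saved ad-hoc rules against the entities created for the new
// pack. Unresolvable refs yield NULL FKs: the rule is preserved
// (non-functional) rather than lost.
function restoreRules(
  saved: SavedRule[],
  actionsByRef: Map<string, string>,
  triggersByRef: Map<string, string>,
): RestoredRule[] {
  return saved.map((r) => ({
    name: r.name,
    actionId: r.actionRef ? actionsByRef.get(r.actionRef) ?? null : null,
    triggerId: r.triggerRef ? triggersByRef.get(r.triggerRef) ?? null : null,
    isAdhoc: true,
  }));
}
```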

## Files Changed

| File | Change |
|------|--------|
| `crates/common/src/repositories/rule.rs` | Added `RestoreRuleInput`, `find_adhoc_by_pack()`, `restore_rule()`, `relink_action_by_ref()`, `relink_trigger_by_ref()` |
| `crates/api/src/routes/packs.rs` | Save ad-hoc rules before pack deletion; restore them and re-link orphaned cross-pack rules after component loading |

## Testing

- Zero compiler warnings across the workspace
- All unit tests pass
- Integration test failures are pre-existing (no `attune_test` database configured)
96 work-summary/2026-02-22-stackstorm-param-schema.md Normal file
@@ -0,0 +1,96 @@
# StackStorm-Style Parameter Schema Migration

**Date**: 2026-02-22

## Summary

Migrated the `param_schema` format from standard JSON Schema to StackStorm-style flat parameter maps with `required` and `secret` inlined per parameter. This makes parameter definitions more readable and eliminates the clunky top-level `required` array pattern from JSON Schema.

## Format Change

### Before (JSON Schema)

```yaml
parameters:
  type: object
  properties:
    url:
      type: string
      description: "Target URL"
    token:
      type: string
      secret: true
  required:
    - url
```
### After (StackStorm-style)

```yaml
parameters:
  url:
    type: string
    description: "Target URL"
    required: true
  token:
    type: string
    secret: true
```

The `type: object` / `properties:` wrapper is removed. `required` moves from a top-level array to an inline boolean per parameter. `secret` was already inline and remains unchanged.
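Handling both formats during the transition amounts to a small normalizer. This sketch mirrors the spirit of the Web UI's `extractProperties()`; the actual implementation lives in `ParamSchemaForm.tsx` and may differ in detail:

```typescript
type FlatParam = {
  type?: string;
  description?: string;
  required?: boolean;
  secret?: boolean;
  [k: string]: unknown;
};
type FlatSchema = Record<string, FlatParam>;

// Normalize either format to the flat StackStorm-style map. Legacy
// JSON Schema is detected by the `type: "object"` + `properties`
// wrapper; its top-level `required` array is folded into
// per-parameter `required` flags.
function extractProperties(schema: any): FlatSchema {
  if (!schema || typeof schema !== "object") return {};
  if (schema.type === "object" && schema.properties) {
    const required: string[] = Array.isArray(schema.required) ? schema.required : [];
    const out: FlatSchema = {};
    for (const [name, prop] of Object.entries<any>(schema.properties)) {
      out[name] = { ...prop, required: required.includes(name) };
    }
    return out;
  }
  return schema as FlatSchema; // already flat
}
```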

## Scope

- **`param_schema`** (action, trigger, sensor, workflow parameters): Converted to StackStorm-style
- **`out_schema`** (output schemas): Left as standard JSON Schema — `required`/`secret` are not meaningful for outputs
- **Database**: No migration needed — columns are JSONB; the JSON shape just changes
- **Backward compatibility**: Web UI `extractProperties()` handles both formats during the transition

## Files Modified

### Pack YAML Files (13 files)

- `packs/core/actions/echo.yaml`
- `packs/core/actions/sleep.yaml`
- `packs/core/actions/noop.yaml`
- `packs/core/actions/http_request.yaml`
- `packs/core/actions/download_packs.yaml`
- `packs/core/actions/register_packs.yaml`
- `packs/core/actions/build_pack_envs.yaml`
- `packs/core/actions/get_pack_dependencies.yaml`
- `packs/core/triggers/intervaltimer.yaml`
- `packs/core/triggers/crontimer.yaml`
- `packs/core/triggers/datetimetimer.yaml`
- `packs/core/workflows/install_packs.yaml`
- `packs/examples/actions/list_example.yaml`

### Web UI (7 files)

- `web/src/components/common/ParamSchemaForm.tsx` — New `ParamSchemaProperty` interface with inline `required`/`secret`/`position`, new exported `extractProperties()` utility, updated `validateParamSchema()` logic
- `web/src/components/common/ParamSchemaDisplay.tsx` — Imports shared `extractProperties`, removed duplicate type definitions
- `web/src/components/common/ExecuteActionModal.tsx` — Uses shared `extractProperties` for parameter initialization
- `web/src/components/common/SchemaBuilder.tsx` — Produces StackStorm-style flat format, added Secret checkbox, handles both formats on input
- `web/src/components/forms/TriggerForm.tsx` — Updated empty-schema check for flat format
- `web/src/pages/actions/ActionsPage.tsx` — Uses `extractProperties`, added Secret badges
- `web/src/pages/triggers/TriggersPage.tsx` — Uses `extractProperties`, added Secret badges

### API DTOs (3 files)

- `crates/api/src/dto/action.rs` — Updated OpenAPI examples and doc comments
- `crates/api/src/dto/trigger.rs` — Updated OpenAPI examples and doc comments
- `crates/api/src/dto/workflow.rs` — Updated OpenAPI examples and doc comments

### Documentation

- `AGENTS.md` — Added Parameter Schema Format documentation in the Pack File Loading section
## Key Design Decisions

1. **Shared `extractProperties()` utility**: A single exported function in `ParamSchemaForm.tsx` handles both StackStorm-style and legacy JSON Schema formats. All consumers import from one place instead of duplicating logic.

2. **Backward compatibility in the Web UI**: The `extractProperties()` function detects the old format (presence of a `type: "object"` + `properties` wrapper) and normalizes it to the flat format, merging the top-level `required` array into per-parameter `required: true` flags. This means existing database records in the old format will still render correctly.

3. **No Rust model changes needed**: `param_schema` is stored as `Option<JsonValue>` (aliased as `JsonSchema`). The Rust code doesn't deeply inspect the schema structure — it passes it through as opaque JSONB. The format change is transparent to the backend.

4. **Pack loaders unchanged**: Both `loader.rs` and `load_core_pack.py` read `data.get("parameters")` and serialize it to JSONB as-is. Since we changed the YAML format, the stored format automatically changes to match.

## Verification

- Rust: `cargo check --all-targets --workspace` — zero warnings
- Rust: `cargo test --workspace --lib` — 82 tests passed
- TypeScript: `npx tsc --noEmit` — clean
- Vite: `npx vite build` — successful production build
93 work-summary/2026-02-23-workflow-builder-ui.md Normal file
@@ -0,0 +1,93 @@
# Workflow Builder UI Implementation

**Date**: 2026-02-23

## Summary

Implemented a visual workflow builder interface for creating and editing workflow actions. The builder is accessible from the Actions page and provides a node-based canvas for constructing workflows using installed actions as task building blocks.

## Changes

### Frontend (Web UI)

#### New Pages

- **`web/src/pages/actions/WorkflowBuilderPage.tsx`** — Main workflow builder page with:
  - Top toolbar with pack selector, workflow name/label/version inputs, save button
  - Description, tags, and enabled toggle in a secondary row
  - Three-panel layout: action palette (left), canvas (center), task inspector (right)
  - Definition JSON preview panel (toggleable)
  - Validation error display
  - Support for both create (`/actions/workflows/new`) and edit (`/actions/workflows/:ref/edit`) modes
#### New Components (`web/src/components/workflows/`)

- **`ActionPalette.tsx`** — Searchable sidebar listing all available actions grouped by pack. Clicking an action adds it as a task to the canvas with input parameters auto-populated from the action's schema.
- **`WorkflowCanvas.tsx`** — Visual canvas with:
  - Draggable task nodes with absolute positioning
  - SVG edge rendering for task transitions
  - Interactive connection mode: click a port on one node, then click another node to create success/failure transitions
  - Grid background, empty state with guidance
  - Floating "add task" button
- **`TaskNode.tsx`** — Individual task node component showing task name, action reference, input count, badges for conditions/retry/iteration, and connection/configure/delete action buttons
- **`WorkflowEdges.tsx`** — SVG overlay rendering curved bezier edges between connected nodes with color-coded and dash-styled lines per transition type (success=green, failure=red dashed, complete=indigo, timeout=amber, decision=violet). Includes arrow markers and edge labels.
- **`TaskInspector.tsx`** — Right-side property panel with collapsible sections for:
  - Basic settings (name, type, condition)
  - Action selection (dropdown of all actions) with auto-populate from schema
  - Transitions (on_success, on_failure, on_complete, on_timeout dropdowns)
  - Iteration (with_items, batch_size, concurrency)
  - Retry & timeout configuration
  - Publish variables (key=value pairs for workflow variable publishing)

#### New Types & Utilities (`web/src/types/workflow.ts`)

- TypeScript types for workflow builder state, tasks, edges, parameters, and the YAML definition format
- `builderStateToDefinition()` — Converts builder state to the YAML-compatible definition format
- `definitionToBuilderState()` — Converts existing workflow definitions back to builder state (for edit mode)
- `deriveEdges()` — Extracts visual edges from task transition properties
- `validateWorkflow()` — Client-side validation (name, label, version, pack, task names, action assignments, transition references)
- Utility functions: `generateTaskId()`, `createEmptyTask()`, `generateUniqueTaskName()`
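Of these, `deriveEdges()` is the piece that ties the task data model to the canvas. A minimal sketch of the idea, with types simplified relative to `web/src/types/workflow.ts` (the real shapes and names may differ):

```typescript
type TransitionKind = "success" | "failure" | "complete" | "timeout";

interface BuilderTask {
  id: string;
  name: string;
  // transition kind -> target task name
  transitions: Partial<Record<TransitionKind, string>>;
}

interface Edge {
  from: string;
  to: string;
  kind: TransitionKind;
}

// Walk every task's transition map and emit one visual edge per
// populated transition, resolving target task names to node ids.
// Transitions pointing at unknown tasks produce no edge.
function deriveEdges(tasks: BuilderTask[]): Edge[] {
  const idByName = new Map(tasks.map((t) => [t.name, t.id] as [string, string]));
  const edges: Edge[] = [];
  for (const task of tasks) {
    for (const [kind, target] of Object.entries(task.transitions)) {
      const to = target ? idByName.get(target) : undefined;
      if (to) edges.push({ from: task.id, to, kind: kind as TransitionKind });
    }
  }
  return edges;
}
```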

#### New Hooks (`web/src/hooks/useWorkflows.ts`)

- `useWorkflows()` — List workflows with filtering
- `useWorkflow()` — Get a single workflow by ref
- `useCreateWorkflow()` / `useUpdateWorkflow()` / `useDeleteWorkflow()` — Standard CRUD mutations
- `useSaveWorkflowFile()` — Calls `POST /api/v1/packs/{pack_ref}/workflow-files` to save a workflow file to disk
- `useUpdateWorkflowFile()` — Calls `PUT /api/v1/workflows/{ref}/file` to update a workflow file on disk

#### Modified Files

- **`web/src/pages/actions/ActionsPage.tsx`** — Added a "Workflow" button in the header that navigates to `/actions/workflows/new`
- **`web/src/App.tsx`** — Added lazy-loaded routes for `WorkflowBuilderPage` at `/actions/workflows/new` and `/actions/workflows/:ref/edit`

### Backend (API)
#### New Endpoints

- **`POST /api/v1/packs/{pack_ref}/workflow-files`** — Saves a new workflow:
  1. Validates the request and checks that the pack exists
  2. Checks for a duplicate workflow ref
  3. Writes `{name}.workflow.yaml` to `{packs_base_dir}/{pack_ref}/actions/workflows/`
  4. Creates the `workflow_definition` record in the database
  5. Returns the workflow response

- **`PUT /api/v1/workflows/{ref}/file`** — Updates an existing workflow:
  1. Validates the request and finds the existing workflow
  2. Overwrites the YAML file on disk
  3. Updates the database record
  4. Returns the updated workflow response

#### New DTO

- **`SaveWorkflowFileRequest`** in `crates/api/src/dto/workflow.rs` — Request body with name, label, description, version, pack_ref, definition (JSON), param_schema, out_schema, tags, enabled

#### Modified Files

- **`crates/api/src/routes/workflows.rs`** — Added `save_workflow_file` and `update_workflow_file` handlers and a `write_workflow_yaml` helper. Updated routes to include the new endpoints. Added unit tests.
- **`crates/api/src/dto/workflow.rs`** — Added the `SaveWorkflowFileRequest` DTO

## Workflow File Storage

Workflow files are saved to `{packs_base_dir}/{pack_ref}/actions/workflows/{name}.workflow.yaml`.

This is a new path (`actions/workflows/`), distinct from the existing `workflows/` directory used by the pack sync mechanism. The definition is serialized as YAML and persisted to both disk and the database.

## Testing

- All 89 existing unit tests pass
- 2 new unit tests added for `SaveWorkflowFileRequest` validation
- TypeScript compilation passes with zero errors from new code
- Rust workspace compilation passes with zero warnings
88 work-summary/2026-02-unified-schema-format.md Normal file
@@ -0,0 +1,88 @@
# Unified Schema Format: Flat Format for All Schema Types

**Date**: 2026-02-05

## Summary

Unified all schema types (`param_schema`, `out_schema`, `conf_schema`) to use the same flat StackStorm-style format with inline `required` and `secret` per parameter. Previously, `param_schema` used the flat format while `out_schema` and `conf_schema` used standard JSON Schema (`{ type: "object", properties: { ... }, required: [...] }`). This inconsistency prevented features like `secret` badges from working on output and configuration schemas.

## Motivation

- No reason for `conf_schema` and `out_schema` to use a different format than `param_schema`
- Users should be able to mark `secret` and `required` inline on any schema type
- Eliminates dual-format shim logic in the web UI (the `extractProperties` backward-compatibility branch)
- The project is pre-production — no data migration needed, just adjust configurations

## Changes

### Pack YAML Files (13 files)

Converted all top-level `type: object` + `properties` wrappers to the flat format, moving `required` array entries inline:

- `packs/core/pack.yaml` — `conf_schema`
- `packs/examples/pack.yaml` — `conf_schema`
- `packs/core/sensors/interval_timer_sensor.yaml` — `parameters`
- `packs/core/triggers/intervaltimer.yaml` — `output`
- `packs/core/triggers/crontimer.yaml` — `output`
- `packs/core/triggers/datetimetimer.yaml` — `output`
- `packs/core/actions/http_request.yaml` — `output_schema`
- `packs/core/actions/build_pack_envs.yaml` — `output_schema`
- `packs/core/actions/download_packs.yaml` — `output_schema`
- `packs/core/actions/get_pack_dependencies.yaml` — `output_schema`
- `packs/core/actions/register_packs.yaml` — `output_schema`
- `packs/core/workflows/install_packs.yaml` — `output_schema`
- `packs/examples/actions/list_example.yaml` — `output_schema`

Nested structures (e.g., `items: { type: object, properties: { ... } }` within array parameters) remain unchanged — only the top-level wrapper was converted.
### Web UI (6 files)

- **`ParamSchemaForm.tsx`** — Removed the legacy JSON Schema branch from `extractProperties()`. Removed `extractJsonSchemaProperties()` (no longer needed). A single `extractProperties()` handles all schema types.
- **`ParamSchemaDisplay.tsx`** — Updated doc comment, tightened the `schema` prop type from `ParamSchema | any` to `ParamSchema`.
- **`SchemaBuilder.tsx`** — Removed legacy JSON Schema reading from both the `useEffect` initializer and `handleRawJsonChange`. Only reads/writes the flat format.
- **`PackForm.tsx`** — Updated the `confSchema` initial state from JSON Schema to `{}`. Updated the `hasSchemaProperties` check (no longer looks for a `.properties` sub-key). Updated config sync logic, validation, schema examples (the API/Database/Webhook examples now use the flat format with `secret` and `required` inline), and the `ParamSchemaForm` pass-through (passes `confSchema` directly instead of `confSchema.properties`).
- **`TriggerForm.tsx`** — Updated the `paramSchema` and `outSchema` initial states from JSON Schema to `{}`.
- **`TriggersPage.tsx`** — Uses `extractProperties()` for both `param_schema` and `out_schema`.

### Backend Rust (6 files)

- **`crates/api/src/validation/params.rs`** — Added a `flat_to_json_schema()` function that converts the flat format to JSON Schema internally before passing it to `jsonschema::Validator`. Updated `validate_trigger_params()` and `validate_action_params()` to call the converter. Converted all 29 test schemas from JSON Schema to the flat format. Added 4 unit tests for `flat_to_json_schema()` and 1 test for secret-field validation.
- **`crates/api/src/dto/action.rs`** — Updated the `out_schema` doc comment and utoipa example.
- **`crates/api/src/dto/trigger.rs`** — Updated the `out_schema` and sensor `param_schema` doc comments and utoipa examples.
- **`crates/api/src/dto/workflow.rs`** — Updated the `out_schema` doc comment and utoipa example.
- **`crates/api/src/dto/pack.rs`** — Updated the `conf_schema` doc comment and utoipa example.
- **`crates/api/src/dto/inquiry.rs`** — Updated the `response_schema` doc comment and utoipa example.
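The `flat_to_json_schema()` conversion is the inverse of the UI-side normalization: it rebuilds a standard JSON Schema so an off-the-shelf validator can consume it. An illustrative TypeScript rendering of the same transform (the real implementation is Rust in `crates/api/src/validation/params.rs` and may differ in detail):

```typescript
interface FlatParam {
  type?: string;
  description?: string;
  required?: boolean;
  secret?: boolean;
  [k: string]: unknown;
}

// Convert a flat StackStorm-style parameter map into standard JSON
// Schema. Per-parameter `required: true` flags are collected into the
// top-level `required` array, and the non-JSON-Schema `required` and
// `secret` keys are stripped from each property.
function flatToJsonSchema(flat: Record<string, FlatParam>): object {
  const properties: Record<string, object> = {};
  const required: string[] = [];
  for (const [name, param] of Object.entries(flat)) {
    const { required: req, secret, ...rest } = param;
    if (req) required.push(name);
    properties[name] = rest;
  }
  return { type: "object", properties, required };
}
```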

### Documentation

- **`AGENTS.md`** — Updated the "Parameter Schema Format" section to "Schema Format (Unified)", reflecting that all schema types now use the same flat format.

## Test Results

- All 29 backend validation tests pass (converted to flat-format schemas)
- TypeScript compilation clean (zero errors)
- Rust workspace compilation clean (zero warnings)
## Format Reference

**Before** (JSON Schema for `out_schema`/`conf_schema`):

```yaml
output:
  type: object
  properties:
    fired_at:
      type: string
      format: date-time
  required:
    - fired_at
```

**After** (unified flat format):

```yaml
output:
  fired_at:
    type: string
    format: date-time
    required: true
    secret: false  # optional, can mark outputs as secret too
```