working on sensors and rules

2026-02-19 20:37:17 -06:00
parent a1b9b8d2b1
commit f9cfcf8f40
31 changed files with 1316 additions and 586 deletions

.gitignore vendored

@@ -79,3 +79,5 @@ tests/pids/*
.env.docker
docker-compose.override.yml
*.pid
packs.examples/


@@ -222,7 +222,9 @@ Enforcement created → Execution scheduled → Worker executes Action
- **Action Script Resolution**: Worker constructs file paths as `{packs_base_dir}/{pack_ref}/actions/{entrypoint}`
- **Runtime YAML Loading**: Pack registration reads `runtimes/*.yaml` files and inserts them into the `runtime` table. Runtime refs use the format `{pack_ref}.{name}` (e.g., `core.python`, `core.shell`).
- **Runtime Selection**: Determined by the action's runtime field (e.g., "Shell", "Python") - compared case-insensitively; when an explicit `runtime_name` is set in execution context, it is authoritative (no fallback to extension matching)
- **Worker Runtime Loading**: Worker loads all runtimes from the DB that have a non-empty `execution_config` (i.e., runtimes with an interpreter configured). Native runtimes (e.g., `core.native` with empty config) are automatically skipped since they execute binaries directly.
- **Native Runtime Detection**: Runtime detection is purely data-driven via `execution_config` in the runtime table. A runtime with empty `execution_config` (or empty `interpreter.binary`) is native — the entrypoint is executed directly without an interpreter. There is no special "builtin" runtime concept.
- **Sensor Runtime Assignment**: Sensors declare their `runner_type` in YAML (e.g., `python`, `native`). The pack loader resolves this to the correct runtime from the database. Default is `native` (compiled binary, no interpreter). Legacy values `standalone` and `builtin` map to `core.native`.
- **Runtime Environment Setup**: Worker creates isolated environments (virtualenvs, node_modules) on-demand at `{runtime_envs_dir}/{pack_ref}/{runtime_name}` before first execution; setup is idempotent
- **Parameter Delivery**: Actions receive parameters via stdin as JSON (never environment variables)
- **Output Format**: Actions declare output format (text/json/yaml) - json/yaml are parsed into execution.result JSONB
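The path construction rule above can be sketched in a few lines. This is a minimal illustration, not the worker's actual code; the function name and the example values (`/opt/packs`, `core`, `notify.sh`) are hypothetical stand-ins for the real configuration fields:

```rust
use std::path::PathBuf;

/// Sketch of the resolution rule: {packs_base_dir}/{pack_ref}/actions/{entrypoint}
fn resolve_action_script(packs_base_dir: &str, pack_ref: &str, entrypoint: &str) -> PathBuf {
    PathBuf::from(packs_base_dir)
        .join(pack_ref)
        .join("actions")
        .join(entrypoint)
}

fn main() {
    let path = resolve_action_script("/opt/packs", "core", "notify.sh");
    // Per the delivery rule above, parameters would then be serialized to JSON
    // and written to the child process's stdin - never exported as env vars.
    println!("{}", path.display());
}
```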


@@ -14,9 +14,11 @@ pub fn create_cors_layer(allowed_origins: Vec<String>) -> CorsLayer {
// Default development origins
vec![
    "http://localhost:3000".to_string(),
    "http://localhost:3001".to_string(),
    "http://localhost:5173".to_string(),
    "http://localhost:8080".to_string(),
    "http://127.0.0.1:3000".to_string(),
    "http://127.0.0.1:3001".to_string(),
    "http://127.0.0.1:5173".to_string(),
    "http://127.0.0.1:8080".to_string(),
]


@@ -1,6 +1,9 @@
//! Parameter validation module
//!
//! Validates trigger and action parameters against their declared JSON schemas.
//! Template-aware: values containing `{{ }}` template expressions are replaced
//! with schema-appropriate placeholders before validation, so template expressions
//! pass type checks while literal values are still validated normally.
use attune_common::models::{action::Action, trigger::Trigger};
use jsonschema::Validator;
@@ -8,15 +11,167 @@ use serde_json::Value;
use crate::middleware::ApiError;

/// Check if a JSON value is (or contains) a template expression.
fn is_template_expression(value: &Value) -> bool {
match value {
Value::String(s) => s.contains("{{") && s.contains("}}"),
_ => false,
}
}
/// Given a JSON Schema property definition, produce a placeholder value that
/// satisfies the schema's type constraint. This is used to replace template
/// expressions so that JSON Schema validation passes for the remaining
/// (non-template) parts of the parameters.
fn placeholder_for_schema(property_schema: &Value) -> Value {
// Handle anyOf / oneOf by picking the first variant
if let Some(any_of) = property_schema.get("anyOf").and_then(|v| v.as_array()) {
if let Some(first) = any_of.first() {
return placeholder_for_schema(first);
}
}
if let Some(one_of) = property_schema.get("oneOf").and_then(|v| v.as_array()) {
if let Some(first) = one_of.first() {
return placeholder_for_schema(first);
}
}
let type_value = property_schema.get("type").and_then(|t| t.as_str());
match type_value {
Some("integer") => {
// Use minimum if set, else default if set, else 0
if let Some(default) = property_schema.get("default") {
return default.clone();
}
if let Some(min) = property_schema.get("minimum").and_then(|v| v.as_i64()) {
return Value::Number(min.into());
}
Value::Number(0.into())
}
Some("number") => {
if let Some(default) = property_schema.get("default") {
return default.clone();
}
if let Some(min) = property_schema.get("minimum").and_then(|v| v.as_f64()) {
return serde_json::Number::from_f64(min)
.map(Value::Number)
.unwrap_or(Value::Number(0.into()));
}
serde_json::Number::from_f64(0.0)
.map(Value::Number)
.unwrap_or(Value::Number(0.into()))
}
Some("boolean") => {
if let Some(default) = property_schema.get("default") {
return default.clone();
}
Value::Bool(true)
}
Some("array") => {
if let Some(default) = property_schema.get("default") {
return default.clone();
}
Value::Array(vec![])
}
Some("object") => {
if let Some(default) = property_schema.get("default") {
return default.clone();
}
Value::Object(serde_json::Map::new())
}
Some("string") | None => {
// For enum fields, use the first valid value so enum validation passes
if let Some(enum_values) = property_schema.get("enum").and_then(|v| v.as_array()) {
if let Some(first) = enum_values.first() {
return first.clone();
}
}
if let Some(default) = property_schema.get("default") {
return default.clone();
}
Value::String("__template_placeholder__".to_string())
}
Some(_) => Value::Null,
}
}
/// Walk a parameters object and replace any template expression values with
/// schema-appropriate placeholders. Only replaces leaf values that match
/// `{{ ... }}`; non-template values are left untouched for normal validation.
///
/// `schema` should be the full JSON Schema object (with `properties`, `type`, etc).
fn replace_templates_with_placeholders(params: &Value, schema: &Value) -> Value {
match params {
Value::Object(map) => {
let properties = schema.get("properties").and_then(|p| p.as_object());
let mut result = serde_json::Map::new();
for (key, value) in map {
let prop_schema = properties.and_then(|p| p.get(key));
if is_template_expression(value) {
// Replace with a type-appropriate placeholder
if let Some(ps) = prop_schema {
result.insert(key.clone(), placeholder_for_schema(ps));
} else {
// No schema for this property — keep as string placeholder
result.insert(
key.clone(),
Value::String("__template_placeholder__".to_string()),
);
}
} else if value.is_object() {
// Recurse into nested objects
let empty_schema = Value::Object(serde_json::Map::new());
let nested_schema = prop_schema.unwrap_or(&empty_schema);
result.insert(
key.clone(),
replace_templates_with_placeholders(value, nested_schema),
);
} else if value.is_array() {
// Recurse into arrays — check each element
if let Some(arr) = value.as_array() {
let empty_items_schema = Value::Object(serde_json::Map::new());
let item_schema = prop_schema
.and_then(|ps| ps.get("items"))
.unwrap_or(&empty_items_schema);
let new_arr: Vec<Value> = arr
.iter()
.map(|item| {
if is_template_expression(item) {
placeholder_for_schema(item_schema)
} else if item.is_object() || item.is_array() {
replace_templates_with_placeholders(item, item_schema)
} else {
item.clone()
}
})
.collect();
result.insert(key.clone(), Value::Array(new_arr));
} else {
result.insert(key.clone(), value.clone());
}
} else {
result.insert(key.clone(), value.clone());
}
}
Value::Object(result)
}
other => other.clone(),
}
}
/// Validate trigger parameters against the trigger's parameter schema.
/// Template expressions (`{{ ... }}`) are accepted for any field type.
pub fn validate_trigger_params(trigger: &Trigger, params: &Value) -> Result<(), ApiError> {
// If no schema is defined, accept any parameters
let Some(schema) = &trigger.param_schema else {
return Ok(());
};
// Replace template expressions with schema-appropriate placeholders
let sanitized = replace_templates_with_placeholders(params, schema);
// Compile the JSON schema
let compiled_schema = Validator::new(schema).map_err(|e| {
@@ -26,9 +181,9 @@ pub fn validate_trigger_params(trigger: &Trigger, params: &Value) -> Result<(),
))
})?;
// Validate the sanitized parameters
let errors: Vec<String> = compiled_schema
.iter_errors(&sanitized)
.map(|e| {
let path = e.instance_path().to_string();
if path.is_empty() {
@@ -50,13 +205,17 @@ pub fn validate_trigger_params(trigger: &Trigger, params: &Value) -> Result<(),
Ok(())
}

/// Validate action parameters against the action's parameter schema.
/// Template expressions (`{{ ... }}`) are accepted for any field type.
pub fn validate_action_params(action: &Action, params: &Value) -> Result<(), ApiError> {
// If no schema is defined, accept any parameters
let Some(schema) = &action.param_schema else {
return Ok(());
};
// Replace template expressions with schema-appropriate placeholders
let sanitized = replace_templates_with_placeholders(params, schema);
// Compile the JSON schema
let compiled_schema = Validator::new(schema).map_err(|e| {
ApiError::InternalServerError(format!(
@@ -65,9 +224,9 @@ pub fn validate_action_params(action: &Action, params: &Value) -> Result<(), Api
))
})?;
// Validate the sanitized parameters
let errors: Vec<String> = compiled_schema
.iter_errors(&sanitized)
.map(|e| {
let path = e.instance_path().to_string();
if path.is_empty() {
@@ -94,9 +253,10 @@ mod tests {
use super::*;
use serde_json::json;

// ── Helper builders ──────────────────────────────────────────────

fn make_trigger(schema: Option<Value>) -> Trigger {
Trigger {
id: 1,
r#ref: "test.trigger".to_string(),
pack: Some(1),
@@ -104,7 +264,7 @@ mod tests {
label: "Test Trigger".to_string(),
description: None,
enabled: true,
param_schema: schema,
out_schema: None,
webhook_enabled: false,
webhook_key: None,
@@ -112,12 +272,43 @@ mod tests {
is_adhoc: false,
created: chrono::Utc::now(),
updated: chrono::Utc::now(),
}
}
fn make_action(schema: Option<Value>) -> Action {
Action {
id: 1,
r#ref: "test.action".to_string(),
pack: 1,
pack_ref: "test".to_string(),
label: "Test Action".to_string(),
description: "Test action".to_string(),
entrypoint: "test.sh".to_string(),
runtime: Some(1),
param_schema: schema,
out_schema: None,
is_workflow: false,
workflow_def: None,
is_adhoc: false,
parameter_delivery: attune_common::models::ParameterDelivery::default(),
parameter_format: attune_common::models::ParameterFormat::default(),
output_format: attune_common::models::OutputFormat::default(),
created: chrono::Utc::now(),
updated: chrono::Utc::now(),
}
}
// ── No schema ────────────────────────────────────────────────────
#[test]
fn test_validate_trigger_params_with_no_schema() {
let trigger = make_trigger(None);
let params = json!({ "any": "value" });
assert!(validate_trigger_params(&trigger, &params).is_ok());
}
// ── Basic trigger validation (no templates) ──────────────────────
#[test]
fn test_validate_trigger_params_with_valid_params() {
let schema = json!({
@@ -129,24 +320,7 @@ mod tests {
"required": ["unit", "delta"]
});

let trigger = make_trigger(Some(schema));
let params = json!({ "unit": "seconds", "delta": 10 });
assert!(validate_trigger_params(&trigger, &params).is_ok());
}
@@ -162,23 +336,7 @@ mod tests {
"required": ["unit", "delta"]
});

let trigger = make_trigger(Some(schema));
// Missing required field 'delta'
let params = json!({ "unit": "seconds" });
@@ -193,6 +351,8 @@ mod tests {
assert!(validate_trigger_params(&trigger, &params).is_err());
}
// ── Basic action validation (no templates) ───────────────────────
#[test]
fn test_validate_action_params_with_valid_params() {
let schema = json!({
@@ -203,27 +363,7 @@ mod tests {
"required": ["message"]
});

let action = make_action(Some(schema));
let params = json!({ "message": "Hello, world!" });
assert!(validate_action_params(&action, &params).is_ok());
}
@@ -238,28 +378,327 @@ mod tests {
"required": ["message"]
});

let action = make_action(Some(schema));
let params = json!({});
assert!(validate_action_params(&action, &params).is_err());
}
// ── Template-aware validation ────────────────────────────────────
#[test]
fn test_template_in_integer_field_passes() {
let schema = json!({
"type": "object",
"properties": {
"counter": { "type": "integer" }
},
"required": ["counter"]
});
let action = make_action(Some(schema));
let params = json!({ "counter": "{{ event.payload.counter }}" });
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_template_in_boolean_field_passes() {
let schema = json!({
"type": "object",
"properties": {
"verbose": { "type": "boolean" }
},
"required": ["verbose"]
});
let action = make_action(Some(schema));
let params = json!({ "verbose": "{{ event.payload.debug }}" });
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_template_in_number_field_passes() {
let schema = json!({
"type": "object",
"properties": {
"threshold": { "type": "number", "minimum": 0.0 }
},
"required": ["threshold"]
});
let action = make_action(Some(schema));
let params = json!({ "threshold": "{{ event.payload.threshold }}" });
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_template_in_enum_field_passes() {
let schema = json!({
"type": "object",
"properties": {
"level": { "type": "string", "enum": ["info", "warn", "error"] }
},
"required": ["level"]
});
let action = make_action(Some(schema));
let params = json!({ "level": "{{ event.payload.severity }}" });
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_template_in_array_field_passes() {
let schema = json!({
"type": "object",
"properties": {
"recipients": { "type": "array", "items": { "type": "string" } }
},
"required": ["recipients"]
});
let action = make_action(Some(schema));
let params = json!({ "recipients": "{{ event.payload.emails }}" });
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_template_in_object_field_passes() {
let schema = json!({
"type": "object",
"properties": {
"metadata": { "type": "object" }
},
"required": ["metadata"]
});
let action = make_action(Some(schema));
let params = json!({ "metadata": "{{ event.payload.meta }}" });
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_mixed_template_and_literal_values() {
let schema = json!({
"type": "object",
"properties": {
"message": { "type": "string" },
"count": { "type": "integer" },
"verbose": { "type": "boolean" }
},
"required": ["message", "count", "verbose"]
});
let action = make_action(Some(schema));
// Mix of literal and template values
let params = json!({
"message": "Hello",
"count": "{{ event.payload.count }}",
"verbose": true
});
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_literal_values_still_validated() {
let schema = json!({
"type": "object",
"properties": {
"message": { "type": "string" },
"count": { "type": "integer" }
},
"required": ["message", "count"]
});
let action = make_action(Some(schema));
// Template for message is fine, but literal "not_a_number" for integer is not
let params = json!({
"message": "{{ event.payload.msg }}",
"count": "not_a_number"
});
assert!(validate_action_params(&action, &params).is_err());
}
#[test]
fn test_required_field_still_enforced_with_templates() {
let schema = json!({
"type": "object",
"properties": {
"message": { "type": "string" },
"count": { "type": "integer" }
},
"required": ["message", "count"]
});
let action = make_action(Some(schema));
// Only message provided (even as template), count is missing
let params = json!({ "message": "{{ event.payload.msg }}" });
assert!(validate_action_params(&action, &params).is_err());
}
#[test]
fn test_pack_config_template_passes() {
let schema = json!({
"type": "object",
"properties": {
"api_key": { "type": "string" },
"timeout": { "type": "integer" }
},
"required": ["api_key", "timeout"]
});
let action = make_action(Some(schema));
let params = json!({
"api_key": "{{ pack.config.api_key }}",
"timeout": "{{ pack.config.default_timeout }}"
});
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_system_template_passes() {
let schema = json!({
"type": "object",
"properties": {
"timestamp": { "type": "string" },
"rule_id": { "type": "integer" }
},
"required": ["timestamp", "rule_id"]
});
let action = make_action(Some(schema));
let params = json!({
"timestamp": "{{ system.timestamp }}",
"rule_id": "{{ system.rule.id }}"
});
assert!(validate_action_params(&action, &params).is_ok());
}
#[test]
fn test_trigger_params_template_aware() {
let schema = json!({
"type": "object",
"properties": {
"unit": { "type": "string", "enum": ["seconds", "minutes", "hours"] },
"delta": { "type": "integer", "minimum": 1 }
},
"required": ["unit", "delta"]
});
let trigger = make_trigger(Some(schema));
// Both fields as templates
let params = json!({
"unit": "{{ pack.config.timer_unit }}",
"delta": "{{ pack.config.timer_delta }}"
});
assert!(validate_trigger_params(&trigger, &params).is_ok());
}
// ── Placeholder generation ───────────────────────────────────────
#[test]
fn test_is_template_expression() {
assert!(is_template_expression(&json!("{{ event.payload.x }}")));
assert!(is_template_expression(&json!("{{ pack.config.key }}")));
assert!(is_template_expression(&json!(
"prefix {{ system.ts }} suffix"
)));
assert!(!is_template_expression(&json!("no braces here")));
assert!(!is_template_expression(&json!(42)));
assert!(!is_template_expression(&json!(true)));
assert!(!is_template_expression(&json!("{ single braces }")));
}
#[test]
fn test_placeholder_for_schema_types() {
assert_eq!(
placeholder_for_schema(&json!({"type": "integer"})),
json!(0)
);
assert_eq!(
placeholder_for_schema(&json!({"type": "number"})),
json!(0.0)
);
assert_eq!(
placeholder_for_schema(&json!({"type": "boolean"})),
json!(true)
);
assert_eq!(placeholder_for_schema(&json!({"type": "array"})), json!([]));
assert_eq!(
placeholder_for_schema(&json!({"type": "object"})),
json!({})
);
assert_eq!(
placeholder_for_schema(&json!({"type": "string"})),
json!("__template_placeholder__")
);
}
#[test]
fn test_placeholder_respects_enum() {
let schema = json!({"type": "string", "enum": ["a", "b", "c"]});
assert_eq!(placeholder_for_schema(&schema), json!("a"));
}
#[test]
fn test_placeholder_respects_default() {
let schema = json!({"type": "integer", "default": 42});
assert_eq!(placeholder_for_schema(&schema), json!(42));
}
#[test]
fn test_placeholder_respects_minimum() {
let schema = json!({"type": "integer", "minimum": 5});
assert_eq!(placeholder_for_schema(&schema), json!(5));
}
#[test]
fn test_nested_object_template_replacement() {
let schema = json!({
"type": "object",
"properties": {
"outer": {
"type": "object",
"properties": {
"inner_count": { "type": "integer" }
}
}
}
});
let params = json!({
"outer": {
"inner_count": "{{ event.payload.count }}"
}
});
let sanitized = replace_templates_with_placeholders(&params, &schema);
// The inner template should be replaced with an integer placeholder
assert!(sanitized["outer"]["inner_count"].is_number());
}
#[test]
fn test_array_element_template_replacement() {
let schema = json!({
"type": "object",
"properties": {
"tags": {
"type": "array",
"items": { "type": "string" }
}
}
});
let params = json!({
"tags": ["literal", "{{ event.payload.tag }}"]
});
let sanitized = replace_templates_with_placeholders(&params, &schema);
let tags = sanitized["tags"].as_array().unwrap();
assert_eq!(tags[0], "literal");
assert!(tags[1].is_string());
assert_ne!(tags[1], "{{ event.payload.tag }}");
}
}


@@ -464,7 +464,7 @@ pub mod runtime {
}

fn default_interpreter_binary() -> String {
String::new()
}

impl Default for InterpreterConfig {


@@ -144,10 +144,7 @@ impl<'a> PackComponentLoader<'a> {
let runtime_ref = match data.get("ref").and_then(|v| v.as_str()) {
Some(r) => r.to_string(),
None => {
let msg = format!("Runtime YAML {} missing 'ref' field, skipping", filename);
warn!("{}", msg);
result.warnings.push(msg);
continue;
@@ -155,9 +152,7 @@ impl<'a> PackComponentLoader<'a> {
};

// Check if runtime already exists
if let Some(existing) = RuntimeRepository::find_by_ref(self.pool, &runtime_ref).await? {
info!(
"Runtime '{}' already exists (ID: {}), skipping",
runtime_ref, existing.id
@@ -204,10 +199,7 @@ impl<'a> PackComponentLoader<'a> {
match RuntimeRepository::create(self.pool, input).await {
Ok(rt) => {
info!("Created runtime '{}' (ID: {})", runtime_ref, rt.id);
result.runtimes_loaded += 1;
}
Err(e) => {
@@ -509,15 +501,19 @@ impl<'a> PackComponentLoader<'a> {
self.pack_ref
);

for (filename, content) in &yaml_files {
let data: serde_yaml_ng::Value = serde_yaml_ng::from_str(content).map_err(|e| {
Error::validation(format!("Failed to parse sensor YAML {}: {}", filename, e))
})?;
// Resolve sensor runtime from YAML runner_type field.
// Defaults to "native" if not specified (compiled binary, no interpreter).
let runner_type = data
.get("runner_type")
.and_then(|v| v.as_str())
.unwrap_or("native");
let (sensor_runtime_id, sensor_runtime_ref) = self.resolve_runtime(runner_type).await?;
let sensor_ref = match data.get("ref").and_then(|v| v.as_str()) {
Some(r) => r.to_string(),
None => {
@@ -581,7 +577,7 @@ impl<'a> PackComponentLoader<'a> {
label,
description,
entrypoint,
runtime: sensor_runtime_id,
runtime_ref: sensor_runtime_ref.clone(),
trigger: trigger_id.unwrap_or(0),
trigger_ref: trigger_ref.unwrap_or_default(),
@@ -606,7 +602,7 @@ impl<'a> PackComponentLoader<'a> {
Ok(())
}

/// Resolve a runtime ID from a runner type string (e.g., "shell", "python", "native").
///
/// Looks up the runtime in the database by `core.{name}` ref pattern,
/// then falls back to name-based lookup (case-insensitive).
@@ -614,8 +610,20 @@ impl<'a> PackComponentLoader<'a> {
/// - "shell" -> "core.shell"
/// - "python" -> "core.python"
/// - "node" -> "core.nodejs"
/// - "native" -> "core.native"
async fn resolve_runtime_id(&self, runner_type: &str) -> Result<Option<Id>> {
let (id, _ref) = self.resolve_runtime(runner_type).await?;
if id == 0 {
Ok(None)
} else {
Ok(Some(id))
}
}
/// Map a runner_type string to a (runtime_id, runtime_ref) pair.
///
/// Returns `(0, "unknown")` when no matching runtime is found.
async fn resolve_runtime(&self, runner_type: &str) -> Result<(Id, String)> {
let runner_lower = runner_type.to_lowercase();

// Runtime refs use the format `{pack_ref}.{name}` (e.g., "core.python").
@@ -623,28 +631,27 @@ impl<'a> PackComponentLoader<'a> {
"shell" | "bash" | "sh" => vec!["core.shell"],
"python" | "python3" => vec!["core.python"],
"node" | "nodejs" | "node.js" => vec!["core.nodejs"],
"native" | "builtin" | "standalone" => vec!["core.native"],
other => vec![other],
};

for runtime_ref in &refs_to_try {
if let Some(runtime) = RuntimeRepository::find_by_ref(self.pool, runtime_ref).await? {
return Ok((runtime.id, runtime.r#ref));
}
}
// Fall back to name-based lookup (case-insensitive)
use crate::repositories::runtime::RuntimeRepository as RR;
if let Some(runtime) = RR::find_by_name(self.pool, &runner_lower).await? {
return Ok((runtime.id, runtime.r#ref));
}
warn!( warn!(
"Could not find runtime for runner_type '{}', action will have no runtime", "Could not find runtime for runner_type '{}', component will have no runtime",
runner_type runner_type
); );
Ok(None) Ok((0, "unknown".to_string()))
} }
/// Resolve the trigger reference and ID for a sensor. /// Resolve the trigger reference and ID for a sensor.
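The runner_type normalization in this hunk can be sketched as a standalone function (illustrative only; the real resolver then looks each candidate ref up in the `runtime` table and falls back to a case-insensitive name lookup):

```rust
/// Simplified sketch of the runner_type -> candidate runtime-ref mapping.
/// Names mirror the diff above; this is not the project's actual API.
fn candidate_refs(runner_type: &str) -> Vec<String> {
    let lower = runner_type.to_lowercase();
    let mapped: Vec<&str> = match lower.as_str() {
        "shell" | "bash" | "sh" => vec!["core.shell"],
        "python" | "python3" => vec!["core.python"],
        "node" | "nodejs" | "node.js" => vec!["core.nodejs"],
        // "builtin" and "standalone" now alias the native runtime
        "native" | "builtin" | "standalone" => vec!["core.native"],
        _ => vec![],
    };
    if mapped.is_empty() {
        // Unknown runner types fall through unchanged (lowercased)
        vec![lower]
    } else {
        mapped.into_iter().map(String::from).collect()
    }
}

fn main() {
    assert_eq!(candidate_refs("Standalone"), vec!["core.native"]);
    assert_eq!(candidate_refs("python3"), vec!["core.python"]);
    assert_eq!(candidate_refs("MyPack.custom"), vec!["mypack.custom"]);
    println!("ok");
}
```

This is why the separate `core.builtin` ref can be deleted: every former alias collapses onto `core.native`.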
@@ -146,7 +146,7 @@ impl RuntimeDetector {
     /// Verify if a runtime is available on this system
     pub async fn verify_runtime_available(runtime: &Runtime) -> bool {
-        // Check if runtime is always available (e.g., shell, native, builtin)
+        // Check if runtime is always available (e.g., shell, native)
         if let Some(verification) = runtime.distributions.get("verification") {
             if let Some(always_available) = verification.get("always_available") {
                 if always_available.as_bool() == Some(true) {
@@ -264,7 +264,7 @@ mod tests {
         assert!(RefValidator::validate_runtime_ref("core.python").is_ok());
         assert!(RefValidator::validate_runtime_ref("core.shell").is_ok());
         assert!(RefValidator::validate_runtime_ref("mypack.nodejs").is_ok());
-        assert!(RefValidator::validate_runtime_ref("core.builtin").is_ok());
+        assert!(RefValidator::validate_runtime_ref("core.native").is_ok());
         // Invalid formats
         assert!(RefValidator::validate_runtime_ref("core.action.webhook").is_err()); // 3-part no longer valid
@@ -17,6 +17,7 @@ use attune_common::{
     },
     repositories::{
         event::{CreateEnforcementInput, EnforcementRepository, EventRepository},
+        pack::PackRepository,
         rule::RuleRepository,
         Create, FindById, List,
     },
@@ -191,7 +192,7 @@ impl EventProcessor {
         .unwrap_or_else(|| serde_json::Map::new());

     // Resolve action parameters using the template resolver
-    let resolved_params = Self::resolve_action_params(rule, event, &payload)?;
+    let resolved_params = Self::resolve_action_params(pool, rule, event, &payload).await?;

     let create_input = CreateEnforcementInput {
         rule: Some(rule.id),
@@ -354,7 +355,8 @@ impl EventProcessor {
     /// Replaces `{{ event.payload.* }}`, `{{ event.id }}`, `{{ event.trigger }}`,
     /// `{{ event.created }}`, `{{ pack.config.* }}`, and `{{ system.* }}` references
     /// in the rule's `action_params` with values from the event and context.
-    fn resolve_action_params(
+    async fn resolve_action_params(
+        pool: &PgPool,
         rule: &Rule,
         event: &Event,
         event_payload: &serde_json::Value,
@@ -366,11 +368,26 @@ impl EventProcessor {
         return Ok(serde_json::Map::new());
     }

+    // Load pack config from database for pack.config.* resolution
+    let pack_config = match PackRepository::find_by_id(pool, rule.pack).await {
+        Ok(Some(pack)) => pack.config,
+        Ok(None) => {
+            warn!(
+                "Pack {} not found for rule {} — pack.config.* templates will resolve to null",
+                rule.pack, rule.r#ref
+            );
+            serde_json::json!({})
+        }
+        Err(e) => {
+            warn!("Failed to load pack {} for rule {}: {} — pack.config.* templates will resolve to null", rule.pack, rule.r#ref, e);
+            serde_json::json!({})
+        }
+    };
+
     // Build template context from the event
     let context = TemplateContext::new(
         event_payload.clone(),
-        // TODO: Load pack config from database for pack.config.* resolution
-        serde_json::json!({}),
+        pack_config,
         serde_json::json!({
             "timestamp": chrono::Utc::now().to_rfc3339(),
             "rule": {
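The pack config loaded above feeds the `{{ pack.config.* }}` substitution the doc comment describes. A minimal, hypothetical sketch of that substitution over plain strings (the real `TemplateContext` also resolves `event.*` and `system.*` namespaces and operates on JSON values):

```rust
use std::collections::HashMap;

/// Hypothetical sketch of `{{ pack.config.<key> }}` substitution.
/// Not the project's TemplateContext, which handles full JSON values.
fn resolve_pack_config(template: &str, config: &HashMap<String, String>) -> String {
    let mut out = template.to_string();
    for (key, value) in config {
        // Literal braces are doubled inside format! strings
        let token = format!("{{{{ pack.config.{} }}}}", key);
        out = out.replace(&token, value);
    }
    out
}

fn main() {
    let mut config = HashMap::new();
    config.insert("channel".to_string(), "#alerts".to_string());
    let resolved = resolve_pack_config("post to {{ pack.config.channel }}", &config);
    assert_eq!(resolved, "post to #alerts");
    // Unknown keys are left untouched; the processor logs a warning instead
    let untouched = resolve_pack_config("{{ pack.config.missing }}", &config);
    assert_eq!(untouched, "{{ pack.config.missing }}");
    println!("ok");
}
```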
@@ -12,7 +12,7 @@
 use anyhow::{anyhow, Result};
 use attune_common::models::{Id, Sensor, Trigger};
-use attune_common::repositories::{FindById, List};
+use attune_common::repositories::{FindById, List, RuntimeRepository};
 use sqlx::{PgPool, Row};
 use std::collections::HashMap;
@@ -38,6 +38,7 @@ struct SensorManagerInner {
     sensors: Arc<RwLock<HashMap<Id, SensorInstance>>>,
     running: Arc<RwLock<bool>>,
     packs_base_dir: String,
+    runtime_envs_dir: String,
     api_client: ApiClient,
     api_url: String,
     mq_url: String,
@@ -58,6 +59,10 @@ impl SensorManager {
     let mq_url = std::env::var("ATTUNE_MQ_URL")
         .unwrap_or_else(|_| "amqp://guest:guest@localhost:5672".to_string());

+    let runtime_envs_dir = std::env::var("ATTUNE_RUNTIME_ENVS_DIR")
+        .or_else(|_| std::env::var("ATTUNE__RUNTIME_ENVS_DIR"))
+        .unwrap_or_else(|_| "/opt/attune/runtime_envs".to_string());
+
     // Create API client for token provisioning (no admin token - uses internal endpoint)
     let api_client = ApiClient::new(api_url.clone(), None);
@@ -67,6 +72,7 @@ impl SensorManager {
     sensors: Arc::new(RwLock::new(HashMap::new())),
     running: Arc::new(RwLock::new(false)),
     packs_base_dir,
+    runtime_envs_dir,
     api_client,
     api_url,
     mq_url,
@@ -212,9 +218,45 @@ impl SensorManager {
     self.inner.packs_base_dir, pack_ref, sensor.entrypoint
 );

+// Load the runtime to determine how to execute the sensor
+let runtime = RuntimeRepository::find_by_id(&self.inner.db, sensor.runtime)
+    .await?
+    .ok_or_else(|| {
+        anyhow!(
+            "Runtime {} not found for sensor {}",
+            sensor.runtime,
+            sensor.r#ref
+        )
+    })?;
+let exec_config = runtime.parsed_execution_config();
+let rt_name = runtime.name.to_lowercase();
+
+// Resolve the interpreter: check for a virtualenv/node_modules first,
+// then fall back to the system interpreter.
+let pack_dir = std::path::PathBuf::from(&self.inner.packs_base_dir).join(pack_ref);
+let env_dir = std::path::PathBuf::from(&self.inner.runtime_envs_dir)
+    .join(pack_ref)
+    .join(&rt_name);
+let env_dir_opt = if env_dir.exists() {
+    Some(env_dir.as_path())
+} else {
+    None
+};
+
+// Determine whether we need an interpreter or can execute directly.
+// Determine native vs interpreted purely from the runtime's execution_config.
+// A native runtime (e.g., core.native) has no interpreter configured —
+// its binary field is empty. Interpreted runtimes (Python, Node, etc.)
+// declare their interpreter binary explicitly in execution_config.
+let interpreter_binary = &exec_config.interpreter.binary;
+let is_native = interpreter_binary.is_empty()
+    || interpreter_binary == "native"
+    || interpreter_binary == "none";
+
 info!(
-    "TRACE: Before fetching trigger instances for sensor {}",
-    sensor.r#ref
+    "Sensor {} runtime={} interpreter={} native={}",
+    sensor.r#ref, rt_name, interpreter_binary, is_native
 );

 info!("Starting standalone sensor process: {}", sensor_script);
@@ -245,9 +287,30 @@ impl SensorManager {
     .map_err(|e| anyhow!("Failed to serialize trigger instances: {}", e))?;
 info!("Trigger instances JSON: {}", trigger_instances_json);

+// Build the command: use the interpreter for non-native runtimes,
+// execute the script directly for native binaries.
+let mut cmd = if is_native {
+    Command::new(&sensor_script)
+} else {
+    let resolved_interpreter =
+        exec_config.resolve_interpreter_with_env(&pack_dir, env_dir_opt);
+    info!(
+        "Using interpreter {} for sensor {}",
+        resolved_interpreter.display(),
+        sensor.r#ref
+    );
+    let mut c = Command::new(resolved_interpreter);
+    // Pass any extra interpreter args (e.g., -u for unbuffered Python)
+    for arg in &exec_config.interpreter.args {
+        c.arg(arg);
+    }
+    c.arg(&sensor_script);
+    c
+};
+
 // Start the standalone sensor with token and configuration
 // Pass sensor ref (e.g., "core.interval_timer_sensor") for proper identification
-let mut child = Command::new(&sensor_script)
+let mut child = cmd
     .env("ATTUNE_API_URL", &self.inner.api_url)
     .env("ATTUNE_API_TOKEN", &token_response.token)
     .env("ATTUNE_SENSOR_ID", &sensor.id.to_string())
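The native-vs-interpreted decision above boils down to the argv the manager spawns. A small sketch of that decision as a pure function (illustrative names, not the project's API):

```rust
/// Sketch of the command-building logic above: return the argv the sensor
/// manager would spawn. Hypothetical helper, shown for illustration only.
fn sensor_argv(interpreter_binary: &str, interpreter_args: &[&str], script: &str) -> Vec<String> {
    // Mirrors the is_native test in the diff: an empty interpreter (or the
    // sentinel values "native"/"none") means execute the binary directly.
    let is_native = interpreter_binary.is_empty()
        || interpreter_binary == "native"
        || interpreter_binary == "none";
    if is_native {
        vec![script.to_string()]
    } else {
        // Interpreted runtimes prepend the interpreter and its extra args
        let mut argv = vec![interpreter_binary.to_string()];
        argv.extend(interpreter_args.iter().map(|a| a.to_string()));
        argv.push(script.to_string());
        argv
    }
}

fn main() {
    assert_eq!(
        sensor_argv("", &[], "/opt/attune/packs/core/sensors/timer"),
        vec!["/opt/attune/packs/core/sensors/timer"]
    );
    assert_eq!(
        sensor_argv("python3", &["-u"], "sensor.py"),
        vec!["python3", "-u", "sensor.py"]
    );
    println!("ok");
}
```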
@@ -434,12 +434,18 @@ async fn process_runtime_for_pack(
 ///
 /// Returns `None` if the variable is not set (meaning all runtimes are accepted).
 pub fn runtime_filter_from_env() -> Option<Vec<String>> {
-    std::env::var("ATTUNE_WORKER_RUNTIMES").ok().map(|val| {
-        val.split(',')
-            .map(|s| s.trim().to_lowercase())
-            .filter(|s| !s.is_empty())
-            .collect()
-    })
+    std::env::var("ATTUNE_WORKER_RUNTIMES")
+        .ok()
+        .map(|val| parse_runtime_filter(&val))
+}
+
+/// Parse a comma-separated runtime filter string into a list of lowercase runtime names.
+/// Empty entries are filtered out.
+fn parse_runtime_filter(val: &str) -> Vec<String> {
+    val.split(',')
+        .map(|s| s.trim().to_lowercase())
+        .filter(|s| !s.is_empty())
+        .collect()
 }

 #[cfg(test)]
@@ -447,26 +453,21 @@ mod tests {
     use super::*;

     #[test]
-    fn test_runtime_filter_from_env_not_set() {
-        // When ATTUNE_WORKER_RUNTIMES is not set, filter should be None
-        std::env::remove_var("ATTUNE_WORKER_RUNTIMES");
-        assert!(runtime_filter_from_env().is_none());
-    }
-
-    #[test]
-    fn test_runtime_filter_from_env_set() {
-        std::env::set_var("ATTUNE_WORKER_RUNTIMES", "shell,Python, Node");
-        let filter = runtime_filter_from_env().unwrap();
+    fn test_parse_runtime_filter_values() {
+        let filter = parse_runtime_filter("shell,Python, Node");
         assert_eq!(filter, vec!["shell", "python", "node"]);
-        std::env::remove_var("ATTUNE_WORKER_RUNTIMES");
     }

     #[test]
-    fn test_runtime_filter_from_env_empty() {
-        std::env::set_var("ATTUNE_WORKER_RUNTIMES", "");
-        let filter = runtime_filter_from_env().unwrap();
+    fn test_parse_runtime_filter_empty() {
+        let filter = parse_runtime_filter("");
         assert!(filter.is_empty());
-        std::env::remove_var("ATTUNE_WORKER_RUNTIMES");
+    }
+
+    #[test]
+    fn test_parse_runtime_filter_whitespace() {
+        let filter = parse_runtime_filter(" shell , , python ");
+        assert_eq!(filter, vec!["shell", "python"]);
     }

     #[test]
@@ -55,12 +55,20 @@ pub async fn execute_streaming(
     let stdin_write_error = if let Some(mut stdin) = child.stdin.take() {
         let mut error = None;

-        // Write parameters first if using stdin delivery
+        // Write parameters first if using stdin delivery.
+        // Skip empty/trivial content ("{}","","[]") to avoid polluting stdin
+        // before secrets — scripts that read secrets via readline() expect
+        // the secrets JSON as the first line.
+        let has_real_params = parameters_stdin
+            .map(|s| !matches!(s.trim(), "" | "{}" | "[]"))
+            .unwrap_or(false);
         if let Some(params_data) = parameters_stdin {
-            if let Err(e) = stdin.write_all(params_data.as_bytes()).await {
-                error = Some(format!("Failed to write parameters to stdin: {}", e));
-            } else if let Err(e) = stdin.write_all(b"\n---ATTUNE_PARAMS_END---\n").await {
-                error = Some(format!("Failed to write parameter delimiter: {}", e));
+            if has_real_params {
+                if let Err(e) = stdin.write_all(params_data.as_bytes()).await {
+                    error = Some(format!("Failed to write parameters to stdin: {}", e));
+                } else if let Err(e) = stdin.write_all(b"\n---ATTUNE_PARAMS_END---\n").await {
+                    error = Some(format!("Failed to write parameter delimiter: {}", e));
+                }
             }
         }
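The trivial-payload gate introduced in this hunk is easy to verify in isolation:

```rust
/// The "real params" check from the hunk above, extracted verbatim:
/// trivial payloads ("", "{}", "[]") are skipped so the secrets JSON
/// stays on the first line of the child's stdin.
fn has_real_params(parameters_stdin: Option<&str>) -> bool {
    parameters_stdin
        .map(|s| !matches!(s.trim(), "" | "{}" | "[]"))
        .unwrap_or(false)
}

fn main() {
    assert!(!has_real_params(None));
    assert!(!has_real_params(Some("  {} ")));
    assert!(!has_real_params(Some("[]")));
    assert!(has_real_params(Some(r#"{"host":"db1"}"#)));
    println!("ok");
}
```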
@@ -8,6 +8,7 @@ use super::{
     RuntimeResult,
 };
 use async_trait::async_trait;
+use std::collections::HashMap;
 use std::path::PathBuf;
 use std::process::Stdio;
 use std::time::Instant;
@@ -16,6 +17,15 @@ use tokio::process::Command;
 use tokio::time::timeout;
 use tracing::{debug, info, warn};

+/// Escape a string for embedding inside a bash single-quoted string.
+///
+/// In single-quoted strings the only problematic character is `'` itself.
+/// We close the current single-quote, insert an escaped single-quote, and
+/// reopen: `'foo'\''bar'` → `foo'bar`.
+fn bash_single_quote_escape(s: &str) -> String {
+    s.replace('\'', "'\\''")
+}
+
 /// Shell runtime for executing shell scripts and commands
 pub struct ShellRuntime {
     /// Shell interpreter path (bash, sh, zsh, etc.)
@@ -75,12 +85,20 @@ impl ShellRuntime {
     let stdin_write_error = if let Some(mut stdin) = child.stdin.take() {
         let mut error = None;

-        // Write parameters first if using stdin delivery
+        // Write parameters first if using stdin delivery.
+        // Skip empty/trivial content ("{}","","[]") to avoid polluting stdin
+        // before secrets — scripts that read secrets via readline() expect
+        // the secrets JSON as the first line.
+        let has_real_params = parameters_stdin
+            .map(|s| !matches!(s.trim(), "" | "{}" | "[]"))
+            .unwrap_or(false);
         if let Some(params_data) = parameters_stdin {
-            if let Err(e) = stdin.write_all(params_data.as_bytes()).await {
-                error = Some(format!("Failed to write parameters to stdin: {}", e));
-            } else if let Err(e) = stdin.write_all(b"\n---ATTUNE_PARAMS_END---\n").await {
-                error = Some(format!("Failed to write parameter delimiter: {}", e));
+            if has_real_params {
+                if let Err(e) = stdin.write_all(params_data.as_bytes()).await {
+                    error = Some(format!("Failed to write parameters to stdin: {}", e));
+                } else if let Err(e) = stdin.write_all(b"\n---ATTUNE_PARAMS_END---\n").await {
+                    error = Some(format!("Failed to write parameter delimiter: {}", e));
+                }
             }
         }
@@ -300,7 +318,12 @@ impl ShellRuntime {
     })
 }

-/// Generate shell wrapper script that injects parameters as environment variables
+/// Generate shell wrapper script that injects parameters and secrets directly.
+///
+/// Secrets are embedded as bash associative-array entries at generation time
+/// so the wrapper has **zero external runtime dependencies** (no Python, jq,
+/// etc.). The generated script is written to a temp file by the caller so
+/// that secrets never appear in `/proc/<pid>/cmdline`.
 fn generate_wrapper_script(&self, context: &ExecutionContext) -> RuntimeResult<String> {
     let mut script = String::new();
@@ -308,25 +331,19 @@ impl ShellRuntime {
     script.push_str("#!/bin/bash\n");
     script.push_str("set -e\n\n"); // Exit on error

-    // Read secrets from stdin and store in associative array
-    script.push_str("# Read secrets from stdin (passed securely, not via environment)\n");
+    // Populate secrets associative array directly from Rust — no stdin
+    // reading, no JSON parsing, no external interpreters.
+    script.push_str("# Secrets (injected at generation time, not via environment)\n");
     script.push_str("declare -A ATTUNE_SECRETS\n");
-    script.push_str("read -r ATTUNE_SECRETS_JSON\n");
-    script.push_str("if [ -n \"$ATTUNE_SECRETS_JSON\" ]; then\n");
-    script.push_str("  # Parse JSON secrets using Python (always available)\n");
-    script.push_str("  eval \"$(echo \"$ATTUNE_SECRETS_JSON\" | python3 -c \"\n");
-    script.push_str("import sys, json\n");
-    script.push_str("try:\n");
-    script.push_str("    secrets = json.load(sys.stdin)\n");
-    script.push_str("    for key, value in secrets.items():\n");
-    script.push_str("        # Escape single quotes in value\n");
-    script.push_str(
-        "        safe_value = value.replace(\\\"'\\\", \\\"'\\\\\\\\\\\\\\\\'\\\") \n",
-    );
-    script.push_str("        print(f\\\"ATTUNE_SECRETS['{key}']='{safe_value}'\\\")\n");
-    script.push_str("except: pass\n");
-    script.push_str("\")\"\n");
-    script.push_str("fi\n\n");
+    for (key, value) in &context.secrets {
+        let escaped_key = bash_single_quote_escape(key);
+        let escaped_val = bash_single_quote_escape(value);
+        script.push_str(&format!(
+            "ATTUNE_SECRETS['{}']='{}'\n",
+            escaped_key, escaped_val
+        ));
+    }
+    script.push('\n');

     // Helper function to get secrets
     script.push_str("# Helper function to access secrets\n");
@@ -344,16 +361,17 @@ impl ShellRuntime {
         serde_json::Value::Bool(b) => b.to_string(),
         _ => serde_json::to_string(value)?,
     };
+    let escaped = bash_single_quote_escape(&value_str);

     // Export with PARAM_ prefix for consistency
     script.push_str(&format!(
         "export PARAM_{}='{}'\n",
         key.to_uppercase(),
-        value_str
+        escaped
     ));
     // Also export without prefix for easier shell script writing
-    script.push_str(&format!("export {}='{}'\n", key, value_str));
+    script.push_str(&format!("export {}='{}'\n", key, escaped));
 }
-script.push_str("\n");
+script.push('\n');

 // Add the action code
 script.push_str("# Action code\n");
@@ -364,44 +382,6 @@ impl ShellRuntime {
     Ok(script)
 }

-/// Execute shell script directly
-async fn execute_shell_code(
-    &self,
-    code: String,
-    secrets: &std::collections::HashMap<String, String>,
-    env: &std::collections::HashMap<String, String>,
-    parameters_stdin: Option<&str>,
-    timeout_secs: Option<u64>,
-    max_stdout_bytes: usize,
-    max_stderr_bytes: usize,
-    output_format: OutputFormat,
-) -> RuntimeResult<ExecutionResult> {
-    debug!(
-        "Executing shell script with {} secrets (passed via stdin)",
-        secrets.len()
-    );
-
-    // Build command
-    let mut cmd = Command::new(&self.shell_path);
-    cmd.arg("-c").arg(&code);
-
-    // Add environment variables
-    for (key, value) in env {
-        cmd.env(key, value);
-    }
-
-    self.execute_with_streaming(
-        cmd,
-        secrets,
-        parameters_stdin,
-        timeout_secs,
-        max_stdout_bytes,
-        max_stderr_bytes,
-        output_format,
-    )
-    .await
-}
-
 /// Execute shell script from file
 async fn execute_shell_file(
     &self,
@@ -520,19 +500,42 @@ impl Runtime for ShellRuntime {
         .await;
 }

-// Otherwise, generate wrapper script and execute
+// Otherwise, generate wrapper script and execute.
+// Secrets and parameters are embedded directly in the wrapper script
+// by generate_wrapper_script(), so we write it to a temp file (to keep
+// secrets out of /proc/cmdline) and pass no secrets/params via stdin.
 let script = self.generate_wrapper_script(&context)?;
-self.execute_shell_code(
-    script,
-    &context.secrets,
-    &env,
-    parameters_stdin,
-    context.timeout,
-    context.max_stdout_bytes,
-    context.max_stderr_bytes,
-    context.output_format,
-)
-.await
+
+// Write wrapper to a temp file so secrets are not exposed in the
+// process command line (which would happen with `bash -c "..."`).
+let wrapper_dir = self.work_dir.join("wrappers");
+tokio::fs::create_dir_all(&wrapper_dir).await.map_err(|e| {
+    RuntimeError::ExecutionFailed(format!("Failed to create wrapper directory: {}", e))
+})?;
+let wrapper_path = wrapper_dir.join(format!("wrapper_{}.sh", context.execution_id));
+tokio::fs::write(&wrapper_path, &script)
+    .await
+    .map_err(|e| {
+        RuntimeError::ExecutionFailed(format!("Failed to write wrapper script: {}", e))
+    })?;
+
+let result = self
+    .execute_shell_file(
+        wrapper_path.clone(),
+        &HashMap::new(), // secrets are in the script, not stdin
+        &env,
+        None,
+        context.timeout,
+        context.max_stdout_bytes,
+        context.max_stderr_bytes,
+        context.output_format,
+    )
+    .await;
+
+// Clean up wrapper file (best-effort)
+let _ = tokio::fs::remove_file(&wrapper_path).await;
+
+result
 }

 async fn setup(&self) -> RuntimeResult<()> {
@@ -716,7 +719,6 @@ mod tests {
 }

 #[tokio::test]
-#[ignore = "Pre-existing failure - secrets not being passed correctly"]
 async fn test_shell_runtime_with_secrets() {
     let runtime = ShellRuntime::new();
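The single-quote escape this file relies on can be demonstrated end-to-end: embedding the escaped value in a single-quoted shell assignment reproduces the original string when bash evaluates it.

```rust
/// The escape helper from the diff above, shown in isolation.
/// `'` cannot appear inside a bash single-quoted string, so we emit
/// `'\''`: close the quote, add a literal escaped quote, reopen.
fn bash_single_quote_escape(s: &str) -> String {
    s.replace('\'', "'\\''")
}

fn main() {
    let secret = "it's a secret";
    let escaped = bash_single_quote_escape(secret);
    assert_eq!(escaped, "it'\\''s a secret");
    // In a generated wrapper this becomes: VAR='it'\''s a secret'
    let line = format!("VAR='{}'", escaped);
    assert_eq!(line, "VAR='it'\\''s a secret'");
    println!("ok");
}
```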
@@ -157,8 +157,8 @@ impl WorkerService {
     // Load runtimes from the database and create ProcessRuntime instances.
     // Each runtime row's `execution_config` JSONB drives how the ProcessRuntime
     // invokes interpreters, manages environments, and installs dependencies.
-    // We skip runtimes with empty execution_config (e.g., the built-in sensor
-    // runtime) since they have no interpreter and cannot execute as a process.
+    // We skip runtimes with empty execution_config (e.g., core.native) since
+    // they execute binaries directly and don't need a ProcessRuntime wrapper.
     match RuntimeRepository::list(&pool).await {
         Ok(db_runtimes) => {
             let executable_runtimes: Vec<_> = db_runtimes
@@ -90,7 +90,7 @@ services:
   # Initialize builtin packs
   # Copies pack files to shared volume and loads them into database
   init-packs:
-    image: python:3.11-alpine
+    image: python:3.11-slim
     container_name: attune-init-packs
     volumes:
       - ./packs:/source/packs:ro
@@ -1,6 +1,7 @@
 #!/bin/sh
 # Initialize builtin packs for Attune
 # This script copies pack files to the shared volume and registers them in the database
+# Designed to run on python:3.11-slim (Debian-based) image

 set -e
@@ -32,20 +33,9 @@ echo -e "${BLUE}║ Attune Builtin Packs Initialization ║${NC}"
 echo -e "${BLUE}╚════════════════════════════════════════════════╝${NC}"
 echo ""

-# Install system dependencies
-echo -e "${YELLOW}${NC} Installing system dependencies..."
-apk add --no-cache postgresql-client > /dev/null 2>&1
-if [ $? -eq 0 ]; then
-    echo -e "${GREEN}${NC} System dependencies installed"
-else
-    echo -e "${RED}${NC} Failed to install system dependencies"
-    exit 1
-fi
-
 # Install Python dependencies
 echo -e "${YELLOW}${NC} Installing Python dependencies..."
-pip install --quiet --no-cache-dir psycopg2-binary pyyaml 2>/dev/null
-if [ $? -eq 0 ]; then
+if pip install --quiet --no-cache-dir psycopg2-binary pyyaml; then
     echo -e "${GREEN}${NC} Python dependencies installed"
 else
     echo -e "${RED}${NC} Failed to install Python dependencies"
@@ -53,10 +43,17 @@ else
 fi
 echo ""

-# Wait for database to be ready
+# Wait for database to be ready (using Python instead of psql to avoid needing postgresql-client)
 echo -e "${YELLOW}${NC} Waiting for database to be ready..."
-export PGPASSWORD="$DB_PASSWORD"
-until psql -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" -c '\q' 2>/dev/null; do
+until python3 -c "
+import psycopg2, sys
+try:
+    conn = psycopg2.connect(host='$DB_HOST', port=$DB_PORT, user='$DB_USER', password='$DB_PASSWORD', dbname='$DB_NAME', connect_timeout=3)
+    conn.close()
+    sys.exit(0)
+except Exception:
+    sys.exit(1)
+" 2>/dev/null; do
     echo -e "${YELLOW} ...${NC} Database is unavailable - sleeping"
     sleep 2
 done
@@ -111,8 +108,7 @@ for pack_dir in "$SOURCE_PACKS_DIR"/*; do
     if [ -d "$target_pack_dir" ]; then
         # Pack exists, update files to ensure we have latest (especially binaries)
         echo -e "${YELLOW}${NC} Pack exists at: $target_pack_dir, updating files..."
-        cp -rf "$pack_dir"/* "$target_pack_dir"/
-        if [ $? -eq 0 ]; then
+        if cp -rf "$pack_dir"/* "$target_pack_dir"/; then
             echo -e "${GREEN}${NC} Updated pack files at: $target_pack_dir"
         else
             echo -e "${RED}${NC} Failed to update pack"
@@ -121,9 +117,7 @@ for pack_dir in "$SOURCE_PACKS_DIR"/*; do
     else
         # Copy pack to target directory
         echo -e "${YELLOW}${NC} Copying pack files..."
-        cp -r "$pack_dir" "$target_pack_dir"
-        if [ $? -eq 0 ]; then
+        if cp -r "$pack_dir" "$target_pack_dir"; then
             COPIED_COUNT=$((COPIED_COUNT + 1))
             echo -e "${GREEN}${NC} Copied to: $target_pack_dir"
         else
@@ -18,8 +18,7 @@ Each runtime YAML file contains only the fields that are stored in the database:
 - **python.yaml** - Python 3 runtime for actions and sensors
 - **nodejs.yaml** - Node.js runtime for JavaScript-based actions and sensors
 - **shell.yaml** - Shell (bash/sh) runtime - always available
-- **native.yaml** - Native compiled runtime (Rust, Go, C, etc.) - always available
-- **sensor_builtin.yaml** - Built-in sensor runtime for native Attune sensors
+- **native.yaml** - Native compiled runtime (Rust, Go, C, etc.) - executes binaries directly without an interpreter

 ## Loading
@@ -1,7 +1,7 @@
 ref: core.native
 pack_ref: core
 name: Native
-description: Native compiled runtime (Rust, Go, C, etc.) - always available
+description: Native compiled runtime (Rust, Go, C, etc.) - executes binaries directly without an interpreter

 distributions:
   verification:
@@ -17,9 +17,4 @@ installation:
   build_required: false
   system_native: true

-execution_config:
-  interpreter:
-    binary: "/bin/sh"
-    args:
-      - "-c"
-  file_extension: null
+execution_config: {}
@@ -1,14 +0,0 @@
-ref: core.builtin
-pack_ref: core
-name: Builtin
-description: Built-in sensor runtime for native Attune sensors (timers, webhooks, etc.)
-
-distributions:
-  verification:
-    always_available: true
-    check_required: false
-  type: builtin
-
-installation:
-  method: builtin
-  included_with_service: true
@@ -7,7 +7,7 @@ description: "Built-in sensor that monitors time and fires timer triggers (interval, ...)"
 enabled: true

 # Sensor runner type
-runner_type: standalone
+runner_type: native

 # Entry point for sensor execution
 entry_point: attune-core-timer-sensor
@@ -442,12 +442,20 @@ class PackLoader:
     sensor_ids = {}
     cursor = self.conn.cursor()

-    # Look up sensor runtime from already-loaded runtimes
-    sensor_runtime_id = runtime_ids.get("builtin") or runtime_ids.get(
-        "core.builtin"
-    )
-    if not sensor_runtime_id:
-        print("  ⚠ No sensor runtime found, sensors will have no runtime")
+    # Runtime name mapping: runner_type values to core runtime refs
+    runner_type_to_ref = {
+        "native": "core.native",
+        "standalone": "core.native",
+        "builtin": "core.native",
+        "shell": "core.shell",
+        "bash": "core.shell",
+        "sh": "core.shell",
+        "python": "core.python",
+        "python3": "core.python",
+        "node": "core.nodejs",
+        "nodejs": "core.nodejs",
+        "node.js": "core.nodejs",
+    }

     for yaml_file in sorted(sensors_dir.glob("*.yaml")):
         sensor_data = self.load_yaml(yaml_file)
@@ -483,6 +491,20 @@ class PackLoader:
         trigger_ref = f"{self.pack_ref}.{first_trigger}"
         trigger_id = trigger_ids.get(trigger_ref)

+        # Resolve sensor runtime from YAML runner_type field
+        # Defaults to "native" (compiled binary, no interpreter)
+        runner_type = sensor_data.get("runner_type", "native").lower()
+        runtime_ref = runner_type_to_ref.get(runner_type, runner_type)
+
+        # Look up runtime ID: try the mapped ref, then the raw runner_type
+        sensor_runtime_id = runtime_ids.get(runtime_ref)
+        if not sensor_runtime_id:
+            # Try looking up by the short name (e.g., "python" key in runtime_ids)
+            sensor_runtime_id = runtime_ids.get(runner_type)
+        if not sensor_runtime_id:
+            print(
+                f"  ⚠ No runtime found for runner_type '{runner_type}' (ref: {runtime_ref}), sensor will have no runtime"
+            )
+
         # Determine entrypoint
         entry_point = sensor_data.get("entry_point", "")
         if not entry_point:
@@ -521,7 +543,7 @@ class PackLoader:
             description,
             entry_point,
             sensor_runtime_id,
-            "core.builtin",
+            runtime_ref,
             trigger_id,
             trigger_ref,
             enabled,
@@ -48,21 +48,29 @@ BEGIN
updated = NOW() updated = NOW()
RETURNING id INTO v_action_runtime_id; RETURNING id INTO v_action_runtime_id;
-    -- Create built-in runtime for sensors (no execution_config = not executable by worker)
-    INSERT INTO attune.runtime (ref, pack, pack_ref, name, description, distributions)
-    VALUES (
-        'core.builtin',
-        v_pack_id,
-        'core',
-        'Builtin',
-        'Built-in sensor runtime for native Attune sensors (timers, webhooks, etc.)',
-        '{"verification": {"always_available": true, "check_required": false}, "type": "builtin"}'::jsonb
-    )
-    ON CONFLICT (ref) DO UPDATE SET
-        name = EXCLUDED.name,
-        description = EXCLUDED.description,
-        updated = NOW()
-    RETURNING id INTO v_sensor_runtime_id;
    -- Use the native runtime for sensors that are compiled binaries
    SELECT id INTO v_sensor_runtime_id
    FROM attune.runtime
    WHERE ref = 'core.native';

    -- If core.native doesn't exist yet (shouldn't happen), create it
    IF v_sensor_runtime_id IS NULL THEN
        INSERT INTO attune.runtime (ref, pack, pack_ref, name, description, distributions, execution_config)
        VALUES (
            'core.native',
            v_pack_id,
            'core',
            'Native',
            'Native compiled runtime (Rust, Go, C, etc.) - executes binaries directly without an interpreter',
            '{"verification": {"always_available": true, "check_required": false}}'::jsonb,
            '{}'::jsonb
        )
        ON CONFLICT (ref) DO UPDATE SET
            name = EXCLUDED.name,
            description = EXCLUDED.description,
            updated = NOW()
        RETURNING id INTO v_sensor_runtime_id;
    END IF;
-- Create generic timer triggers (these define trigger types, not instances)
@@ -366,9 +374,9 @@ BEGIN
    'core',
    '10 Second Timer Sensor',
    'Timer sensor that fires every 10 seconds',
-    'builtin:interval_timer',
    'attune-core-timer-sensor',
    v_sensor_runtime_id,
-    'core.builtin',
    'core.native',
    v_intervaltimer_id,
    'core.intervaltimer',
    true,

View File

@@ -0,0 +1,14 @@
valkey.service - Advanced key-value store
Loaded: loaded (/usr/lib/systemd/system/valkey.service; disabled; preset: disabled)
Active: inactive (dead)
Feb 10 20:42:44 hp-probook-cachy systemd[1]: Started Advanced key-value store.
Feb 19 13:50:06 hp-probook-cachy valkey-server[1154]: 1154:signal-handler (1771530606) Received SIGTERM scheduling shutdown...
Feb 19 13:50:06 hp-probook-cachy systemd[1]: Stopping Advanced key-value store...
Feb 19 13:50:06 hp-probook-cachy valkey-server[1154]: 1154:M 19 Feb 2026 13:50:06.871 * User requested shutdown...
Feb 19 13:50:06 hp-probook-cachy valkey-server[1154]: 1154:M 19 Feb 2026 13:50:06.871 * Saving the final RDB snapshot before exiting.
Feb 19 13:50:06 hp-probook-cachy valkey-server[1154]: 1154:M 19 Feb 2026 13:50:06.874 * DB saved on disk
Feb 19 13:50:06 hp-probook-cachy valkey-server[1154]: 1154:M 19 Feb 2026 13:50:06.874 # Valkey is now ready to exit, bye bye...
Feb 19 13:50:06 hp-probook-cachy systemd[1]: valkey.service: Deactivated successfully.
Feb 19 13:50:06 hp-probook-cachy systemd[1]: Stopped Advanced key-value store.
Feb 19 13:50:06 hp-probook-cachy systemd[1]: valkey.service: Consumed 3min 58.539s CPU time over 1d 15h 35min 51.539s wall clock time, 13.2M memory peak.

View File

@@ -32,12 +32,76 @@ interface ParamSchemaFormProps {
errors?: Record<string, string>;
disabled?: boolean;
className?: string;
/**
* When true, all inputs render as text fields that accept template expressions
* like {{ event.payload.field }}, {{ pack.config.key }}, {{ system.timestamp }}.
* Used in rule configuration where parameters may be dynamically resolved
* at enforcement time rather than set to literal values.
*/
allowTemplates?: boolean;
}
/**
* Check if a string value contains a template expression ({{ ... }})
*/
function isTemplateExpression(value: any): boolean {
return typeof value === "string" && /\{\{.*\}\}/.test(value);
}
/**
* Format a value for display in a text input.
* Non-string values (booleans, numbers, objects, arrays) are JSON-stringified
* so the user can edit them as text.
*/
function valueToString(value: any): string {
if (value === undefined || value === null) return "";
if (typeof value === "string") return value;
return JSON.stringify(value);
}
/**
* Attempt to parse a text input value back to the appropriate JS type.
* Template expressions are always kept as strings.
* Plain values are coerced to the schema type when possible.
*/
function parseTemplateValue(raw: string, type: string): any {
if (raw === "") return "";
// Template expressions stay as strings - resolved server-side
if (isTemplateExpression(raw)) return raw;
switch (type) {
case "boolean":
if (raw === "true") return true;
if (raw === "false") return false;
return raw; // keep as string if not a recognised literal
case "number":
if (!isNaN(Number(raw))) return parseFloat(raw);
return raw;
case "integer":
if (!isNaN(Number(raw)) && Number.isInteger(Number(raw)))
return parseInt(raw, 10);
return raw;
case "array":
case "object":
try {
return JSON.parse(raw);
} catch {
return raw;
}
default:
return raw;
}
  }
}
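The two helpers above can be exercised standalone; this condensed restatement (illustrative, behaviorally equivalent for the cases shown) makes the round-trip visible:

```typescript
// Condensed restatement of the template helpers above, for illustration only.
function isTemplateExpression(value: any): boolean {
  return typeof value === "string" && /\{\{.*\}\}/.test(value);
}

function parseTemplateValue(raw: string, type: string): any {
  if (raw === "") return "";
  if (isTemplateExpression(raw)) return raw; // templates resolve server-side
  switch (type) {
    case "boolean":
      return raw === "true" ? true : raw === "false" ? false : raw;
    case "number":
      return !isNaN(Number(raw)) ? parseFloat(raw) : raw;
    case "integer":
      return !isNaN(Number(raw)) && Number.isInteger(Number(raw))
        ? parseInt(raw, 10)
        : raw;
    case "array":
    case "object":
      try {
        return JSON.parse(raw); // valid JSON literals are parsed
      } catch {
        return raw; // anything else stays a string
      }
    default:
      return raw;
  }
}

// Literal values are coerced; template expressions pass through untouched.
console.log(parseTemplateValue("true", "boolean")); // true
console.log(parseTemplateValue("{{ event.payload.count }}", "number")); // unchanged string
```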
/**
 * Dynamic form component that renders inputs based on a parameter schema.
 * Supports standard JSON Schema format with properties and required array.
 * Supports string, number, integer, boolean, array, object, and enum types.
*
* When `allowTemplates` is enabled, every field renders as a text input that
* accepts Jinja2-style template expressions (e.g. {{ event.payload.x }}).
* This is essential for rule configuration, where parameter values may reference
* event payloads, pack configs, keys, or system variables.
 */
export default function ParamSchemaForm({
  schema,
@@ -46,6 +110,7 @@ export default function ParamSchemaForm({
  errors = {},
  disabled = false,
  className = "",
allowTemplates = false,
}: ParamSchemaFormProps) {
  const [localErrors, setLocalErrors] = useState<Record<string, string>>({});
@@ -98,7 +163,71 @@ export default function ParamSchemaForm({
};

/**
- * Render input field based on parameter type
 * Get a placeholder hint for template-mode inputs
*/
const getTemplatePlaceholder = (key: string, param: any): string => {
const type = param?.type || "string";
switch (type) {
case "boolean":
return `true, false, or {{ event.payload.${key} }}`;
case "number":
case "integer":
return `${type} value or {{ event.payload.${key} }}`;
case "array":
return `["a","b"] or {{ event.payload.${key} }}`;
case "object":
return `{"k":"v"} or {{ event.payload.${key} }}`;
default:
if (param?.enum && param.enum.length > 0) {
const options = param.enum.slice(0, 3).join(", ");
const suffix = param.enum.length > 3 ? ", ..." : "";
return `${options}${suffix} or {{ event.payload.${key} }}`;
}
return param?.description || `{{ event.payload.${key} }}`;
}
};
/**
* Render a template-mode text input for any parameter type
*/
const renderTemplateInput = (key: string, param: any) => {
const type = param?.type || "string";
const rawValue = values[key] ?? param?.default ?? "";
const isDisabled = disabled;
const displayValue = valueToString(rawValue);
// Use a textarea for complex types (array/object) to give more room
if (type === "array" || type === "object") {
return (
<textarea
value={displayValue}
onChange={(e) =>
handleInputChange(key, parseTemplateValue(e.target.value, type))
}
disabled={isDisabled}
rows={3}
className="w-full px-3 py-2 border border-gray-300 rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 font-mono text-sm disabled:bg-gray-100 disabled:cursor-not-allowed"
placeholder={getTemplatePlaceholder(key, param)}
/>
);
}
return (
<input
type="text"
value={displayValue}
onChange={(e) =>
handleInputChange(key, parseTemplateValue(e.target.value, type))
}
disabled={isDisabled}
className="w-full px-3 py-2 border border-gray-300 rounded-lg focus:ring-2 focus:ring-blue-500 focus:border-blue-500 disabled:bg-gray-100 disabled:cursor-not-allowed"
placeholder={getTemplatePlaceholder(key, param)}
/>
);
};
/**
* Render input field based on parameter type (standard mode)
 */
const renderInput = (key: string, param: any) => {
  const type = param?.type || "string";
@@ -249,6 +378,38 @@ export default function ParamSchemaForm({
  }
};
/**
* Render type hint badge and additional context for template-mode fields
*/
const renderTemplateHints = (_key: string, param: any) => {
const type = param?.type || "string";
const hints: string[] = [];
if (type === "boolean") {
hints.push("Accepts: true, false, or a template expression");
} else if (type === "number" || type === "integer") {
const parts = [`Accepts: ${type} value`];
if (param?.minimum !== undefined) parts.push(`min: ${param.minimum}`);
if (param?.maximum !== undefined) parts.push(`max: ${param.maximum}`);
hints.push(parts.join(", ") + ", or a template expression");
} else if (param?.enum && param.enum.length > 0) {
hints.push(`Options: ${param.enum.join(", ")}`);
hints.push("Also accepts a template expression");
}
if (hints.length === 0) return null;
return (
<div className="mt-1 space-y-0.5">
{hints.map((hint, i) => (
<p key={i} className="text-xs text-gray-500">
{hint}
</p>
))}
</div>
);
};
const paramEntries = Object.entries(properties);

if (paramEntries.length === 0) {
@@ -261,6 +422,26 @@ export default function ParamSchemaForm({
return (
  <div className={`space-y-4 ${className}`}>
{allowTemplates && (
<div className="px-3 py-2 bg-amber-50 border border-amber-200 rounded-lg">
<p className="text-xs text-amber-800">
<span className="font-semibold">Template expressions</span> are
supported. Use{" "}
<code className="px-1 py-0.5 bg-amber-100 rounded text-[11px]">
{"{{ event.payload.field }}"}
</code>
,{" "}
<code className="px-1 py-0.5 bg-amber-100 rounded text-[11px]">
{"{{ pack.config.key }}"}
</code>
, or{" "}
<code className="px-1 py-0.5 bg-amber-100 rounded text-[11px]">
{"{{ system.timestamp }}"}
</code>{" "}
to dynamically resolve values when the rule fires.
</p>
</div>
)}
{paramEntries.map(([key, param]) => (
  <div key={key}>
    <label className="block mb-2">
@@ -283,8 +464,19 @@ export default function ParamSchemaForm({
{param?.description && param?.type !== "boolean" && (
  <p className="text-xs text-gray-600 mb-2">{param.description}</p>
)}
{/* For boolean in template mode, show description since there's no checkbox label */}
{param?.description &&
param?.type === "boolean" &&
allowTemplates && (
<p className="text-xs text-gray-600 mb-2">
{param.description}
</p>
)}
</label>
-{renderInput(key, param)}
{allowTemplates
  ? renderTemplateInput(key, param)
  : renderInput(key, param)}
{allowTemplates && renderTemplateHints(key, param)}
{allErrors[key] && (
  <p className="text-xs text-red-600 mt-1">{allErrors[key]}</p>
)}
@@ -302,12 +494,16 @@ export default function ParamSchemaForm({
}

/**
 * Utility function to validate parameter values against a schema.
 * Supports standard JSON Schema format.
*
* When `allowTemplates` is true, template expressions ({{ ... }}) are
* accepted for any field type and skip type-specific validation.
 */
export function validateParamSchema(
  schema: ParamSchema,
  values: Record<string, any>,
  allowTemplates: boolean = false,
): Record<string, string> {
  const errors: Record<string, string> = {};
  const properties = schema.properties || {};
@@ -333,12 +529,23 @@ export function validateParamSchema(
  return;
}
// Template expressions are always valid in template mode
if (allowTemplates && isTemplateExpression(value)) {
return;
}
const type = param?.type || "string";

switch (type) {
  case "number":
  case "integer":
    if (typeof value !== "number" && isNaN(Number(value))) {
if (allowTemplates) {
// In template mode, non-numeric strings that aren't templates
// are still allowed — the user might be mid-edit or using a
// non-standard expression format. Only warn on submission.
break;
}
      errors[key] = `Must be a valid ${type}`;
    } else {
      const numValue = typeof value === "number" ? value : Number(value);
@@ -351,8 +558,23 @@ export function validateParamSchema(
    }
    break;
case "boolean":
// In template mode, string values like "true"/"false" are fine
if (
allowTemplates &&
typeof value === "string" &&
(value === "true" || value === "false")
) {
break;
}
break;
  case "array":
    if (!Array.isArray(value)) {
if (allowTemplates && typeof value === "string") {
// In template mode, strings are acceptable (could be template or JSON)
break;
}
      try {
        JSON.parse(value);
      } catch {
@@ -363,6 +585,9 @@ export function validateParamSchema(
  case "object":
    if (typeof value !== "object" || Array.isArray(value)) {
if (allowTemplates && typeof value === "string") {
break;
}
      try {
        const parsed = JSON.parse(value);
        if (typeof parsed !== "object" || Array.isArray(parsed)) {
@@ -392,8 +617,9 @@ export function validateParamSchema(
    break;
}

-// Enum validation
-if (param?.enum && param.enum.length > 0) {
// Enum validation — skip in template mode (value may be a template expression
// or a string that will be resolved at runtime)
if (!allowTemplates && param?.enum && param.enum.length > 0) {
  if (!param.enum.includes(value)) {
    errors[key] = `Must be one of: ${param.enum.join(", ")}`;
  }
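The template-mode rules above reduce to two decisions: template expressions short-circuit all type checks, and enum membership is only enforced for literal values in non-template mode. A minimal standalone model (the schema shape here is an assumption, loosely mirroring `ParamSchema`):

```typescript
// Assumed, simplified schema shape for illustration.
type ParamSchema = {
  properties: Record<string, { type?: string; enum?: string[] }>;
  required?: string[];
};

function isTemplate(v: any): boolean {
  return typeof v === "string" && /\{\{.*\}\}/.test(v);
}

// Sketch of template-aware validation: required check first, then template
// short-circuit, then enum enforcement only for literal (non-template) mode.
function validate(
  schema: ParamSchema,
  values: Record<string, any>,
  allowTemplates = false,
): Record<string, string> {
  const errors: Record<string, string> = {};
  for (const [key, param] of Object.entries(schema.properties)) {
    const value = values[key];
    if (value === undefined || value === "") {
      if (schema.required?.includes(key)) errors[key] = "Required";
      continue;
    }
    if (allowTemplates && isTemplate(value)) continue; // templates always pass
    if (!allowTemplates && param.enum && !param.enum.includes(value)) {
      errors[key] = `Must be one of: ${param.enum.join(", ")}`;
    }
  }
  return errors;
}
```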

View File

@@ -144,17 +144,19 @@ export default function RuleForm({ rule, onSuccess, onCancel }: RuleFormProps) {
  }
}
-// Validate trigger parameters
// Validate trigger parameters (allow templates in rule context)
const triggerErrors = validateParamSchema(
  triggerParamSchema,
  triggerParameters,
  true,
);
setTriggerParamErrors(triggerErrors);
-// Validate action parameters
// Validate action parameters (allow templates in rule context)
const actionErrors = validateParamSchema(
  actionParamSchema,
  actionParameters,
  true,
);
setActionParamErrors(actionErrors);
@@ -428,6 +430,7 @@ export default function RuleForm({ rule, onSuccess, onCancel }: RuleFormProps) {
values={triggerParameters}
onChange={setTriggerParameters}
errors={triggerParamErrors}
allowTemplates
/>
</div>
)}
@@ -517,6 +520,7 @@ export default function RuleForm({ rule, onSuccess, onCancel }: RuleFormProps) {
values={actionParameters}
onChange={setActionParameters}
errors={actionParamErrors}
allowTemplates
/>
</div>
)}

View File

@@ -1,51 +1,5 @@
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { PacksService, ApiError } from "@/api";
-// Temporary types until API client is regenerated
-interface PackTestResult {
-  pack_ref: string;
-  pack_version: string;
-  execution_time: string;
-  status: string;
-  total_tests: number;
-  passed: number;
-  failed: number;
-  skipped: number;
-  pass_rate: number;
-  duration_ms: number;
-  test_suites: any[];
-}
-
-interface PackTestExecution {
-  id: number;
-  pack_id: number;
-  pack_version: string;
-  execution_time: string;
-  trigger_reason: string;
-  total_tests: number;
-  passed: number;
-  failed: number;
-  skipped: number;
-  pass_rate: number;
-  duration_ms: number;
-  result: PackTestResult;
-  created: string;
-}
-
-interface PackTestHistoryResponse {
-  data: {
-    items: PackTestExecution[];
-    meta: {
-      page: number;
-      page_size: number;
-      total_items: number;
-      total_pages: number;
-    };
-  };
-}
-
-interface PackTestLatestResponse {
-  data: PackTestExecution | null;
-}
// Fetch test history for a pack
export function usePackTestHistory(
@@ -54,27 +8,12 @@ export function usePackTestHistory(
) {
  return useQuery({
    queryKey: ["pack-tests", packRef, params],
-    queryFn: async (): Promise<PackTestHistoryResponse> => {
-      const queryParams = new URLSearchParams();
-      if (params?.page) queryParams.append("page", params.page.toString());
-      if (params?.pageSize)
-        queryParams.append("page_size", params.pageSize.toString());
-
-      const token = localStorage.getItem("access_token");
-      const response = await fetch(
-        `http://localhost:8080/api/v1/packs/${packRef}/tests?${queryParams}`,
-        {
-          headers: {
-            Authorization: `Bearer ${token}`,
-          },
-        },
-      );
-
-      if (!response.ok) {
-        throw new Error(`Failed to fetch test history: ${response.statusText}`);
-      }
-
-      return response.json();
-    },
    queryFn: async () => {
      return PacksService.getPackTestHistory({
        ref: packRef,
        page: params?.page,
        pageSize: params?.pageSize,
      });
    },
    enabled: !!packRef,
    staleTime: 30000, // 30 seconds
@@ -85,25 +24,15 @@ export function usePackTestHistory(
export function usePackLatestTest(packRef: string) {
  return useQuery({
    queryKey: ["pack-tests", packRef, "latest"],
-    queryFn: async (): Promise<PackTestLatestResponse> => {
-      const token = localStorage.getItem("access_token");
-      const response = await fetch(
-        `http://localhost:8080/api/v1/packs/${packRef}/tests/latest`,
-        {
-          headers: {
-            Authorization: `Bearer ${token}`,
-          },
-        },
-      );
-
-      if (!response.ok) {
-        if (response.status === 404) {
-          return { data: null };
-        }
-        throw new Error(`Failed to fetch latest test: ${response.statusText}`);
-      }
-
-      return response.json();
-    },
    queryFn: async () => {
      try {
        return await PacksService.getPackLatestTest({ ref: packRef });
      } catch (error) {
        if (error instanceof ApiError && error.status === 404) {
          return { data: null };
        }
        throw error;
      }
    },
    enabled: !!packRef,
    staleTime: 30000,
@@ -115,27 +44,8 @@ export function useExecutePackTests() {
  const queryClient = useQueryClient();

  return useMutation({
-    mutationFn: async (packRef: string): Promise<{ data: PackTestResult }> => {
-      const token = localStorage.getItem("access_token");
-      const response = await fetch(
-        `http://localhost:8080/api/v1/packs/${packRef}/test`,
-        {
-          method: "POST",
-          headers: {
-            Authorization: `Bearer ${token}`,
-            "Content-Type": "application/json",
-          },
-        },
-      );
-
-      if (!response.ok) {
-        const error = await response.json().catch(() => ({}));
-        throw new Error(
-          error.error || `Failed to execute tests: ${response.statusText}`,
-        );
-      }
-
-      return response.json();
-    },
    mutationFn: async (packRef: string) => {
      return PacksService.testPack({ ref: packRef });
    },
    onSuccess: (_, packRef) => {
      // Invalidate test history and latest test queries
@@ -157,38 +67,14 @@ export function useRegisterPack() {
      path: string;
      force?: boolean;
      skipTests?: boolean;
-    }): Promise<{
-      data: {
-        pack: any;
-        test_result: PackTestResult | null;
-        tests_skipped: boolean;
-      };
-    }> => {
-      const token = localStorage.getItem("access_token");
-      const response = await fetch(
-        "http://localhost:8080/api/v1/packs/register",
-        {
-          method: "POST",
-          headers: {
-            Authorization: `Bearer ${token}`,
-            "Content-Type": "application/json",
-          },
-          body: JSON.stringify({
-            path,
-            force,
-            skip_tests: skipTests,
-          }),
-        },
-      );
-
-      if (!response.ok) {
-        const error = await response.json().catch(() => ({}));
-        throw new Error(
-          error.error || `Failed to register pack: ${response.statusText}`,
-        );
-      }
-
-      return response.json();
-    },
    }) => {
      return PacksService.registerPack({
        requestBody: {
          path,
          force,
          skip_tests: skipTests,
        },
      });
    },
    onSuccess: (data) => {
      // Invalidate packs list and test queries
@@ -219,40 +105,16 @@ export function useInstallPack() {
      force?: boolean;
      skipTests?: boolean;
      skipDeps?: boolean;
-    }): Promise<{
-      data: {
-        pack: any;
-        test_result: PackTestResult | null;
-        tests_skipped: boolean;
-      };
-    }> => {
-      const token = localStorage.getItem("access_token");
-      const response = await fetch(
-        "http://localhost:8080/api/v1/packs/install",
-        {
-          method: "POST",
-          headers: {
-            Authorization: `Bearer ${token}`,
-            "Content-Type": "application/json",
-          },
-          body: JSON.stringify({
-            source,
-            ref_spec: refSpec,
-            force,
-            skip_tests: skipTests,
-            skip_deps: skipDeps,
-          }),
-        },
-      );
-
-      if (!response.ok) {
-        const error = await response.json().catch(() => ({}));
-        throw new Error(
-          error.error || `Failed to install pack: ${response.statusText}`,
-        );
-      }
-
-      return response.json();
-    },
    }) => {
      return PacksService.installPack({
        requestBody: {
          source,
          ref_spec: refSpec,
          force,
          skip_tests: skipTests,
          skip_deps: skipDeps,
        },
      });
    },
    onSuccess: (data) => {
      // Invalidate packs list and test queries

View File

@@ -6,6 +6,30 @@ import axios, {
const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || "";
// A bare axios instance with NO interceptors, used exclusively for token refresh
// requests. This prevents infinite loops when the refresh endpoint returns 401.
const refreshClient = axios.create({
baseURL: API_BASE_URL || undefined,
timeout: 10000,
headers: { "Content-Type": "application/json" },
});
function getRefreshUrl(): string {
return API_BASE_URL ? `${API_BASE_URL}/auth/refresh` : "/auth/refresh";
}
// Clear auth state and redirect to the login page.
function clearSessionAndRedirect(): void {
localStorage.removeItem("access_token");
localStorage.removeItem("refresh_token");
const currentPath = window.location.pathname;
if (currentPath !== "/login") {
sessionStorage.setItem("redirect_after_login", currentPath);
window.location.href = "/login";
}
}
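The URL resolution above is small but easy to get wrong behind a dev proxy: absolute when a base URL is configured, root-relative otherwise. Restated standalone (parameterized here for testability, unlike the module-level constant in the real file):

```typescript
// Sketch of the refresh-URL resolution: absolute when a base URL is set,
// root-relative otherwise so a dev-server proxy can route it.
function getRefreshUrl(apiBaseUrl: string): string {
  return apiBaseUrl ? `${apiBaseUrl}/auth/refresh` : "/auth/refresh";
}
```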
// Create axios instance
export const apiClient: AxiosInstance = axios.create({
  baseURL: API_BASE_URL,
@@ -37,7 +61,7 @@ apiClient.interceptors.response.use(
  _retry?: boolean;
};
// Handle 401 Unauthorized - token expired or invalid
if (error.response?.status === 401 && !originalRequest._retry) {
  originalRequest._retry = true;
@@ -48,11 +72,8 @@ apiClient.interceptors.response.use(
  throw new Error("No refresh token available");
}
-// Attempt token refresh
-const refreshUrl = API_BASE_URL
-  ? `${API_BASE_URL}/auth/refresh`
-  : "/auth/refresh";
-const response = await axios.post(refreshUrl, {
// Use the bare refreshClient (no interceptors) to avoid infinite loops
const response = await refreshClient.post(getRefreshUrl(), {
  refresh_token: refreshToken,
});
@@ -75,23 +96,13 @@ apiClient.interceptors.response.use(
console.error(
  "Token refresh failed, clearing session and redirecting to login",
);
-localStorage.removeItem("access_token");
-localStorage.removeItem("refresh_token");
-// Store the current path so we can redirect back after login
-const currentPath = window.location.pathname;
-if (currentPath !== "/login") {
-  sessionStorage.setItem("redirect_after_login", currentPath);
-}
-window.location.href = "/login";
clearSessionAndRedirect();
return Promise.reject(refreshError);
}
}
// Handle 403 Forbidden - valid token but insufficient permissions
if (error.response?.status === 403) {
-// Enhance error message to distinguish from 401
const enhancedError = error as AxiosError & {
  isAuthorizationError?: boolean;
};

View File

@@ -13,8 +13,39 @@ import axios from "axios";
 * Strategy:
 * Since the generated API client creates its own axios instances, we configure
 * axios defaults globally and ensure the OpenAPI client uses our configured instance.
*
* IMPORTANT: All refresh calls use `refreshClient` — a bare axios instance with
* NO interceptors — to prevent infinite 401 retry loops when the refresh token
* itself is expired or invalid.
 */
const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || "";
// A bare axios instance with NO interceptors, used exclusively for token refresh
// requests. This prevents infinite loops when the refresh endpoint returns 401.
const refreshClient = axios.create({
baseURL: API_BASE_URL || undefined,
timeout: 10000,
headers: { "Content-Type": "application/json" },
});
function getRefreshUrl(): string {
return API_BASE_URL ? `${API_BASE_URL}/auth/refresh` : "/auth/refresh";
}
// Clear auth state and redirect to the login page.
// Safe to call multiple times — only the first redirect takes effect.
function clearSessionAndRedirect(): void {
localStorage.removeItem("access_token");
localStorage.removeItem("refresh_token");
const currentPath = window.location.pathname;
if (currentPath !== "/login") {
sessionStorage.setItem("redirect_after_login", currentPath);
window.location.href = "/login";
}
}
// Helper to decode JWT and check if it's expired or about to expire
export function isTokenExpiringSoon(
  token: string,
@@ -59,6 +90,39 @@ export function isTokenExpired(token: string): boolean {
  }
}
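The expiry helpers referenced above follow the usual pattern: base64url-decode the JWT payload and compare its `exp` claim (seconds since epoch, per RFC 7519) against the current time plus a buffer. A minimal sketch, not the project's exact implementation:

```typescript
// Decode a JWT payload (base64url, no signature verification) and check expiry.
// Sketch only: real helpers should handle malformed tokens more defensively.
function decodePayload(token: string): { exp?: number } {
  const payload = token.split(".")[1];
  // Convert base64url to base64 before decoding.
  const b64 = payload.replace(/-/g, "+").replace(/_/g, "/");
  return JSON.parse(Buffer.from(b64, "base64").toString("utf8"));
}

function isExpiringSoon(token: string, bufferSeconds = 300): boolean {
  const { exp } = decodePayload(token);
  if (!exp) return true; // no expiry claim: treat as needing refresh
  return exp - Date.now() / 1000 < bufferSeconds;
}
```

With a 300-second buffer, a token that expires in one minute reports as expiring soon, while a one-hour token does not.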
// Attempt to refresh the access token using the refresh token.
// Returns true on success, false on failure.
// On failure, clears session and redirects to login.
async function attemptTokenRefresh(): Promise<boolean> {
const currentRefreshToken = localStorage.getItem("refresh_token");
if (!currentRefreshToken) {
console.warn("No refresh token available, redirecting to login");
clearSessionAndRedirect();
return false;
}
try {
const response = await refreshClient.post(getRefreshUrl(), {
refresh_token: currentRefreshToken,
});
const { access_token, refresh_token: newRefreshToken } = response.data.data;
localStorage.setItem("access_token", access_token);
if (newRefreshToken) {
localStorage.setItem("refresh_token", newRefreshToken);
}
return true;
} catch (error) {
console.error(
"Token refresh failed, clearing session and redirecting to login",
);
clearSessionAndRedirect();
return false;
}
}
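The contract that `attemptTokenRefresh` and the `_retry` flag establish is: at most one refresh attempt per failed request, and a refresh failure surfaces the original error. This can be modeled with a fake transport (no axios) to show why the guard prevents a second attempt:

```typescript
// Fake transport modeling the retry-once 401 handling; axios is not involved.
type FakeResponse = { status: number };

let refreshCalls = 0;

// Stands in for attemptTokenRefresh: always fails, like an expired refresh token.
async function fakeRefresh(): Promise<boolean> {
  refreshCalls++;
  return false;
}

// Stands in for the response interceptor around a request that keeps returning 401.
async function requestWithRetry(
  send: () => Promise<FakeResponse>,
): Promise<FakeResponse> {
  const req = { _retry: false };
  let res = await send();
  if (res.status === 401 && !req._retry) {
    req._retry = true; // guard: only one refresh attempt per request
    const refreshed = await fakeRefresh();
    if (refreshed) res = await send();
  }
  return res;
}
```

Because the refresh fails, the original 401 surfaces and the refresh endpoint is hit exactly once, which is the loop the bare `refreshClient` is designed to prevent.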
// Helper to proactively refresh token if needed
export async function ensureValidToken(): Promise<void> {
  const token = localStorage.getItem("access_token");
@@ -70,30 +134,7 @@ export async function ensureValidToken(): Promise<void> {
  // Check if token is expiring soon (within 5 minutes)
  if (isTokenExpiringSoon(token, 300)) {
-    try {
-      const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || "";
-      const refreshUrl = API_BASE_URL
-        ? `${API_BASE_URL}/auth/refresh`
-        : "/auth/refresh";
-
-      // Use base axios to avoid circular refresh attempts
-      const response = await axios.post(refreshUrl, {
-        refresh_token: refreshToken,
-      });
-
-      const { access_token, refresh_token: newRefreshToken } =
-        response.data.data;
-      localStorage.setItem("access_token", access_token);
-      if (newRefreshToken) {
-        localStorage.setItem("refresh_token", newRefreshToken);
-      }
-      // Token proactively refreshed
-    } catch (error) {
-      console.error("Proactive token refresh failed:", error);
-      // Don't throw - let the interceptor handle it on the next request
-    }
    await attemptTokenRefresh();
  }
}
@@ -105,8 +146,6 @@ export function startTokenRefreshMonitor(): void {
    return; // Already running
  }

-  // Starting token refresh monitor
  // Check token every 60 seconds
  tokenCheckInterval = setInterval(async () => {
    const token = localStorage.getItem("access_token");
@@ -121,7 +160,6 @@ export function startTokenRefreshMonitor(): void {
export function stopTokenRefreshMonitor(): void {
  if (tokenCheckInterval) {
-    // Stopping token refresh monitor
    clearInterval(tokenCheckInterval);
    tokenCheckInterval = null;
  }
@@ -130,7 +168,6 @@ export function stopTokenRefreshMonitor(): void {
// Configure axios defaults to apply to all instances
export function configureAxiosDefaults(): void {
  // Set default base URL
-  const API_BASE_URL = import.meta.env.VITE_API_BASE_URL || "";
  if (API_BASE_URL) {
    axios.defaults.baseURL = API_BASE_URL;
  }
@@ -138,8 +175,7 @@ export function configureAxiosDefaults(): void {
  // Set default headers
  axios.defaults.headers.common["Content-Type"] = "application/json";

-  // Copy our interceptors to the default axios instance
-  // This ensures that even new axios instances inherit the behavior
  // Request interceptor — attach JWT to outgoing requests
  axios.interceptors.request.use(
    (config) => {
      const token = localStorage.getItem("access_token");
@@ -153,66 +189,31 @@ export function configureAxiosDefaults(): void {
    },
  );
// Response interceptor — handle 401 with a single refresh attempt
  axios.interceptors.response.use(
    (response) => response,
    async (error) => {
      const originalRequest = error.config as any;
      // Handle 401 Unauthorized - token expired or invalid
      if (error.response?.status === 401 && !originalRequest._retry) {
        originalRequest._retry = true;
-        try {
-          const refreshToken = localStorage.getItem("refresh_token");
-          if (!refreshToken) {
-            console.warn("No refresh token available, redirecting to login");
-            throw new Error("No refresh token available");
-          }
-
-          // Access token expired, attempting refresh
-          const refreshUrl = API_BASE_URL
-            ? `${API_BASE_URL}/auth/refresh`
-            : "/auth/refresh";
-          const response = await axios.post(refreshUrl, {
-            refresh_token: refreshToken,
-          });
-
-          const { access_token, refresh_token: newRefreshToken } =
-            response.data.data;
-          localStorage.setItem("access_token", access_token);
-          if (newRefreshToken) {
-            localStorage.setItem("refresh_token", newRefreshToken);
-          }
-          // Token refreshed successfully
-
-          // Retry original request with new token
-          if (originalRequest.headers) {
-            originalRequest.headers.Authorization = `Bearer ${access_token}`;
-          }
-          return axios(originalRequest);
-        } catch (refreshError) {
-          console.error(
-            "Token refresh failed, clearing session and redirecting to login",
-          );
-          localStorage.removeItem("access_token");
-          localStorage.removeItem("refresh_token");
-          // Store the current path for redirect after login
-          const currentPath = window.location.pathname;
-          if (currentPath !== "/login") {
-            sessionStorage.setItem("redirect_after_login", currentPath);
-          }
-          window.location.href = "/login";
-          return Promise.reject(refreshError);
-        }
        const refreshed = await attemptTokenRefresh();
        if (refreshed) {
          // Retry original request with new token
          const newToken = localStorage.getItem("access_token");
          if (originalRequest.headers && newToken) {
            originalRequest.headers.Authorization = `Bearer ${newToken}`;
          }
          return axios(originalRequest);
        }
        // attemptTokenRefresh already cleared session and redirected
        return Promise.reject(error);
} }
// Handle 403 Forbidden - valid token but insufficient permissions // Handle 403 Forbidden valid token but insufficient permissions
if (error.response?.status === 403) { if (error.response?.status === 403) {
const enhancedError = error as any; const enhancedError = error as any;
enhancedError.isAuthorizationError = true; enhancedError.isAuthorizationError = true;
@@ -226,18 +227,9 @@ export function configureAxiosDefaults(): void {
return Promise.reject(error); return Promise.reject(error);
}, },
); );
// Axios defaults configured with interceptors
} }
// Initialize the API wrapper // Initialize the API wrapper
export function initializeApiWrapper(): void { export function initializeApiWrapper(): void {
// Initializing API wrapper
// Configure axios defaults so all instances get the interceptors
configureAxiosDefaults(); configureAxiosDefaults();
// The generated API client will now inherit these interceptors
// API wrapper initialized
} }


@@ -53,7 +53,7 @@ export default function PackInstallPage() {
             result.data.tests_skipped
               ? "Tests were skipped."
               : result.data.test_result
-                ? `Tests ${result.data.test_result.status}: ${result.data.test_result.passed}/${result.data.test_result.total_tests} passed.`
+                ? `Tests ${result.data.test_result.status}: ${result.data.test_result.passed}/${result.data.test_result.totalTests} passed.`
                 : ""
           }`,
         );
@@ -138,8 +138,7 @@ export default function PackInstallPage() {
             any git server
           </li>
           <li>
-            <strong>Archive URL</strong> - Download from .zip or .tar.gz
-            URL
+            <strong>Archive URL</strong> - Download from .zip or .tar.gz URL
           </li>
           <li>
             <strong>Pack Registry</strong> - Install from configured
@@ -192,8 +191,7 @@ export default function PackInstallPage() {
         {/* Source Type Selection */}
         <div>
           <label className="block text-sm font-medium text-gray-700 mb-2">
-            Installation Source Type{" "}
-            <span className="text-red-500">*</span>
+            Installation Source Type <span className="text-red-500">*</span>
           </label>
           <div className="grid grid-cols-3 gap-3">
             <button


@@ -39,7 +39,7 @@ export default function PackRegisterPage() {
             result.data.tests_skipped
               ? "Tests were skipped."
               : result.data.test_result
-                ? `Tests ${result.data.test_result.status}: ${result.data.test_result.passed}/${result.data.test_result.total_tests} passed.`
+                ? `Tests ${result.data.test_result.status}: ${result.data.test_result.passed}/${result.data.test_result.totalTests} passed.`
                 : ""
           }`,
         );


@@ -0,0 +1,76 @@
# Work Summary: Native Runtime Refactor, Shell Wrapper Cleanup & Stdin Protocol Fixes (2026-02-20)
## Problem
Python sensors (e.g., `python_example.counter_sensor`) were being executed by `/bin/sh` instead of `python3`, causing `import: not found` errors. Root cause was two-fold:
1. **All sensors hardcoded to `core.builtin` runtime**`PackComponentLoader::load_sensors()` ignored the sensor YAML's `runner_type` field and assigned every sensor the `core.builtin` runtime.
2. **`core.builtin` runtime defaulted to `/bin/sh`** — The `InterpreterConfig` default was `/bin/sh`, so runtimes with no `execution_config` (like `core.builtin`) got shell as their interpreter, causing Python scripts to be interpreted as shell.
Additionally, the shell wrapper script had a hard dependency on `python3` for JSON secret parsing, and pre-existing security test failures were traced to a stdin protocol conflict between parameters and secrets.
## Changes
### Architecture: Remove "builtin" runtime concept
Replaced the separate `core.builtin` runtime with `core.native`. Runtime detection is now purely data-driven via `execution_config` in the runtime table — no special-cased runtime names.
- **Deleted** `packs/core/runtimes/sensor_builtin.yaml`
- **Fixed** `packs/core/runtimes/native.yaml` — removed `/bin/sh -c` interpreter; empty `execution_config` signals direct binary execution
- **Changed** `InterpreterConfig` default `binary` from `"/bin/sh"` to `""` (empty = native)
- **Updated** `interval_timer_sensor.yaml` — `runner_type: native`
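A minimal sketch of the resulting data-driven check (illustrative names and a simplified struct; the real `InterpreterConfig` lives in `crates/common/src/models.rs`):

```rust
// Illustrative sketch: an empty interpreter binary now means "native",
// i.e. the worker executes the entrypoint directly instead of via an interpreter.
#[derive(Default)]
pub struct InterpreterConfig {
    pub binary: String, // default is now "" (was "/bin/sh")
    pub args: Vec<String>,
}

pub fn build_command(cfg: &InterpreterConfig, entrypoint: &str) -> Vec<String> {
    if cfg.binary.is_empty() {
        // Native: run the entrypoint directly.
        vec![entrypoint.to_string()]
    } else {
        // Interpreted: interpreter + its args + the script path.
        let mut cmd = vec![cfg.binary.clone()];
        cmd.extend(cfg.args.iter().cloned());
        cmd.push(entrypoint.to_string());
        cmd
    }
}

fn main() {
    let native = InterpreterConfig::default();
    assert_eq!(
        build_command(&native, "/packs/core/actions/ping"),
        vec!["/packs/core/actions/ping"]
    );
    let python = InterpreterConfig { binary: "python3".into(), args: vec![] };
    assert_eq!(build_command(&python, "sensor.py"), vec!["python3", "sensor.py"]);
}
```

With the old `/bin/sh` default, the first branch was unreachable and every config-less runtime produced a `sh <script>` command — exactly the bug described above.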
### Sensor runtime resolution
- **Fixed** `PackComponentLoader::load_sensors()` — reads `runner_type` from each sensor's YAML definition and resolves to the correct runtime via `resolve_runtime()`. Defaults to `native`.
- **Added** `resolve_runtime()` method returning `(Id, String)` — both ID and ref
- **Updated** runtime mappings — `builtin`, `standalone` → `core.native`
- **Updated** `load_core_pack.py` — per-sensor runtime resolution from YAML instead of hardcoded `core.builtin`
- **Updated** `seed_core_pack.sql` — references `core.native` instead of `core.builtin`
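The mapping can be sketched as follows (hypothetical simplification; the real `resolve_runtime()` also looks up and returns the runtime's DB `Id`):

```rust
// Hypothetical sketch of per-sensor runtime resolution from YAML.
// A sensor's `runner_type` maps to a runtime ref of the form "{pack}.{name}";
// a missing value and the legacy names all fall back to the native runtime.
pub fn resolve_runtime_ref(runner_type: Option<&str>) -> String {
    match runner_type.map(str::trim) {
        Some("python") => "core.python".to_string(),
        Some("shell") => "core.shell".to_string(),
        // Legacy names and the default both resolve to core.native.
        Some("builtin") | Some("standalone") | Some("native") | None => {
            "core.native".to_string()
        }
        Some(other) => format!("core.{other}"),
    }
}

fn main() {
    assert_eq!(resolve_runtime_ref(None), "core.native");
    assert_eq!(resolve_runtime_ref(Some("builtin")), "core.native");
    assert_eq!(resolve_runtime_ref(Some("python")), "core.python");
}
```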
### Shell wrapper: remove Python dependency
The shell wrapper script (`generate_wrapper_script`) previously used `python3 -c` to parse JSON secrets from stdin into a bash associative array. This created a hard dependency on Python being installed, which violates the principle that core services must operate without supplemental runtimes.
- **Replaced** runtime JSON parsing with Rust-side secret injection — secrets are now embedded directly as `ATTUNE_SECRETS['key']='value'` entries at script generation time
- **Added** `bash_single_quote_escape()` helper for safe bash string embedding
- **Changed** wrapper execution from `bash -c <script>` to writing a temp file and executing it, keeping secrets out of `/proc/<pid>/cmdline`
- **Removed** unused `execute_shell_code()` method (wrapper now uses `execute_shell_file`)
- **Also applied** bash single-quote escaping to parameter values embedded in the wrapper
- **Un-ignored** `test_shell_runtime_with_secrets` — it now passes
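The escaping is the standard bash idiom (close the quote, emit an escaped quote, reopen); a sketch under the assumption that the helper behaves as its name suggests:

```rust
// Sketch of bash single-quote escaping for embedding untrusted values
// inside single-quoted bash strings: each ' becomes '\''
pub fn bash_single_quote_escape(s: &str) -> String {
    s.replace('\'', r"'\''")
}

// Secrets are embedded at script-generation time as associative-array
// entries, so the wrapper never needs python3 to parse JSON at runtime.
pub fn secret_line(key: &str, value: &str) -> String {
    format!(
        "ATTUNE_SECRETS['{}']='{}'",
        bash_single_quote_escape(key),
        bash_single_quote_escape(value)
    )
}

fn main() {
    assert_eq!(bash_single_quote_escape("it's"), r"it'\''s");
    assert_eq!(
        secret_line("db_pass", "p'q"),
        r"ATTUNE_SECRETS['db_pass']='p'\''q'"
    );
}
```

Writing the generated script to a temp file rather than passing it to `bash -c` keeps these embedded values out of `/proc/<pid>/cmdline`, as noted above.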
### Stdin protocol fixes (pre-existing bugs)
- **Process executor**: Skip writing empty/trivial parameter content (`{}`, `""`, `[]`) to stdin to avoid breaking scripts that read secrets via `readline()`
- **Shell streaming executor**: Same empty-params skip applied
- **Worker env tests**: Replaced flaky env-var-manipulating tests with pure parsing tests to eliminate parallel test interference
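The empty-params guard amounts to a small predicate (hypothetical helper name; the real check lives inline in the executors):

```rust
// Hypothetical guard: only write parameters to the child's stdin when they
// carry actual content, so a script that calls readline() expecting secrets
// isn't handed a stray "{}" line first.
pub fn should_write_params(params: &str) -> bool {
    !matches!(params.trim(), "" | "{}" | "\"\"" | "[]")
}

fn main() {
    assert!(!should_write_params("{}"));
    assert!(!should_write_params("   "));
    assert!(!should_write_params("[]"));
    assert!(should_write_params(r#"{"interval": 5}"#));
}
```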
## Files Changed
| File | Change |
|------|--------|
| `crates/common/src/models.rs` | Default interpreter binary: `"/bin/sh"` → `""` |
| `crates/common/src/pack_registry/loader.rs` | Sensors read `runner_type` from YAML; added `resolve_runtime()`; removed `builtin` mapping |
| `crates/common/src/runtime_detection.rs` | Comment update |
| `crates/common/src/schema.rs` | Test: `core.builtin` → `core.native` |
| `crates/sensor/src/sensor_manager.rs` | Updated `is_native` comments |
| `crates/worker/src/env_setup.rs` | Fixed flaky env var tests |
| `crates/worker/src/runtime/shell.rs` | Rewrote wrapper to embed secrets from Rust (no Python); temp file execution; removed `execute_shell_code`; un-ignored secrets test |
| `crates/worker/src/runtime/process_executor.rs` | Skip empty params on stdin |
| `crates/worker/src/service.rs` | Comment update |
| `packs/core/runtimes/native.yaml` | Removed `/bin/sh` interpreter; empty execution_config |
| `packs/core/runtimes/sensor_builtin.yaml` | **Deleted** |
| `packs/core/runtimes/README.md` | Removed sensor_builtin reference |
| `packs/core/sensors/interval_timer_sensor.yaml` | `runner_type: native` |
| `scripts/load_core_pack.py` | Per-sensor runtime resolution from YAML |
| `scripts/seed_core_pack.sql` | `core.native` references |
| `AGENTS.md` | Updated runtime documentation |
## Test Results
- All 7 security tests pass (2 previously failing now fixed)
- All 82 worker unit tests pass (1 previously ignored now un-ignored and passing)
- All 17 dependency isolation tests pass
- All 8 log truncation tests pass
- All 145 common unit tests pass
- Zero compiler warnings