End-to-End Integration Testing Setup
Date: 2026-01-17 (Session 6)
Phase: Production Readiness - Integration Testing
Status: 🔄 IN PROGRESS
Priority: P0 - BLOCKING
Overview
Set up comprehensive end-to-end integration testing infrastructure to verify all 5 Attune services work correctly together. This is a critical milestone before production deployment.
What Was Accomplished
1. Test Planning & Documentation
Created: tests/README.md (564 lines)
Comprehensive test plan covering:
- 8 Test Scenarios: Timer automation, workflows, FIFO queues, secrets, inquiries, error handling, notifications, dependency isolation
- Test Infrastructure: Prerequisites, service configuration, running tests
- Debugging Guide: Service logs, database queries, message queue inspection
- Success Criteria: Clear checklist for passing tests
Test Scenarios Defined:
1. Basic Timer Automation (~30s)
   - Timer → Event → Rule → Enforcement → Execution → Completion
   - Verifies core automation chain
2. Workflow Execution (~45s)
   - 3-task sequential workflow
   - Verifies task ordering and variable propagation
3. FIFO Queue Ordering (~20s)
   - 5 executions with a concurrency limit
   - Verifies execution ordering is preserved
4. Secret Management (~15s)
   - Action uses secrets via stdin
   - Verifies secrets are not in the environment
5. Human-in-the-Loop (Inquiry) (~30s)
   - Execution pauses for user input
   - Verifies pause/resume flow
6. Error Handling & Recovery (~25s)
   - Action fails with retries
   - Verifies retry logic
7. Real-Time Notifications (~20s)
   - WebSocket updates on execution changes
   - Verifies notification delivery
8. Dependency Isolation (~40s)
   - Two packs with conflicting dependencies
   - Verifies per-pack virtual environments
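The FIFO scenario reduces to one assertion: executions must start in the same order they were submitted, even with a concurrency limit. A minimal sketch of that check, using an illustrative record shape (field names here are assumptions, not the actual Attune API schema):

```python
from dataclasses import dataclass

# Hypothetical execution record; field names are illustrative,
# not the real Attune schema.
@dataclass
class ExecutionRecord:
    id: str
    submitted_at: float  # when the execution was enqueued
    started_at: float    # when a worker picked it up

def assert_fifo_order(records):
    """Verify executions started in the same order they were submitted."""
    submitted_ids = [r.id for r in sorted(records, key=lambda r: r.submitted_at)]
    started_ids = [r.id for r in sorted(records, key=lambda r: r.started_at)]
    assert submitted_ids == started_ids, (
        f"FIFO violated: submitted {submitted_ids}, started {started_ids}"
    )
```

The real test would fetch these timestamps from the API after submitting 5 executions, then call the assertion once all have completed.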
2. E2E Test Configuration
Created: config.e2e.yaml (204 lines)
Test-specific configuration:
- Separate test database: attune_e2e
- Different ports: API=18080, WebSocket=18081
- Faster polling intervals for quicker tests
- Lower bcrypt cost for faster auth tests
- Test-specific directories for artifacts/logs/venvs
- Minimal logging (info level)
- All features enabled
Key Settings:
environment: test
database.url: postgresql://postgres:postgres@localhost:5432/attune_e2e
server.port: 18080
executor.enforcement_poll_interval: 1 # Faster for tests
sensor.poll_interval_seconds: 2 # Faster for tests
worker.max_concurrent_executions: 10
3. Test Fixtures
Created Test Pack: tests/fixtures/packs/test_pack/
Pack Metadata (pack.yaml):
- Pack ref: test_pack
- Version: 1.0.0
- Python dependency: requests>=2.28.0
- Runtime: python3
Echo Action (actions/echo.yaml + echo.py):
- Simple action that echoes a message
- Supports delay parameter for timing tests
- Supports fail parameter for error testing
- Returns timestamp and execution time
- 87 lines of Python implementation
Features:
- JSON input/output via stdin/stdout
- Parameter validation
- Configurable delay (0-30 seconds)
- Intentional failure mode for testing
- Error handling and logging
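The features above can be condensed into a sketch of the echo action's contract (the real echo.py is ~87 lines; parameter and output field names here are illustrative, not verbatim from the fixture):

```python
import json
import sys
import time

# Condensed sketch of the echo action: JSON in on stdin, JSON out on
# stdout, with delay and fail parameters for timing and error tests.
def run_echo(params: dict) -> dict:
    message = params.get("message", "")
    delay = float(params.get("delay", 0))
    if not 0 <= delay <= 30:
        raise ValueError("delay must be between 0 and 30 seconds")
    if params.get("fail"):
        raise RuntimeError("intentional failure requested by test")
    start = time.time()
    time.sleep(delay)
    return {
        "message": message,
        "timestamp": start,
        "execution_time": time.time() - start,
    }

if __name__ == "__main__":
    try:
        result = run_echo(json.load(sys.stdin))
        json.dump({"status": "succeeded", "result": result}, sys.stdout)
    except Exception as exc:
        json.dump({"status": "failed", "error": str(exc)}, sys.stdout)
        sys.exit(1)
```

Keeping the parameter handling in a pure function makes the action itself unit-testable without spawning a subprocess.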
Simple Workflow (workflows/simple_workflow.yaml):
- 3-task sequential workflow
- Tests task ordering
- Tests variable passing
- Tests workflow completion
- Input parameters: workflow_message, workflow_delay
Task Flow:
- task_start - Echo start message, publish start_time
- task_wait - Delay for specified seconds
- task_complete - Echo completion message
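The semantics the workflow test verifies (strict task ordering plus variables propagating between tasks) can be modeled in-process. This is a toy illustration of the test logic, not the Attune workflow engine:

```python
import time

# Toy in-process model of the 3-task sequential flow, to make the
# ordering and variable-propagation semantics concrete.
def run_simple_workflow(workflow_message: str, workflow_delay: float) -> dict:
    context = {
        "workflow_message": workflow_message,
        "workflow_delay": workflow_delay,
    }
    order = []

    # task_start: echo start message, publish start_time into the context
    order.append("task_start")
    context["start_time"] = time.time()

    # task_wait: delay for the specified seconds
    order.append("task_wait")
    time.sleep(context["workflow_delay"])

    # task_complete: echo completion message using a propagated variable
    order.append("task_complete")
    context["completion"] = f"done: {context['workflow_message']}"

    context["task_order"] = order
    return context
```

The real test asserts the same three properties against the API: tasks ran in order, start_time was visible to later tasks, and the workflow reached completion.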
4. Test Infrastructure Setup
Created Directory Structure:
tests/
├── README.md # Test documentation (564 lines)
├── fixtures/ # Test data
│ └── packs/ # Test packs
│ └── test_pack/ # E2E test pack
│ ├── pack.yaml # Pack metadata
│ ├── actions/ # Action definitions
│ │ ├── echo.yaml # Echo action spec
│ │ └── echo.py # Echo action implementation
│ ├── workflows/ # Workflow definitions
│ │ └── simple_workflow.yaml
│ └── sensors/ # Sensor definitions (empty)
5. Documentation Updates
Updated: work-summary/TODO.md
- Marked API authentication as complete
- Reorganized priorities with E2E testing as Priority 1
- Updated success criteria checklist
- Added E2E testing to critical path
Test Infrastructure Components
Services Required
- PostgreSQL - Database (port 5432)
- RabbitMQ - Message queue (ports 5672, 15672)
- Redis - Cache (optional, port 6379)
Attune Services
- API - Port 18080
- Executor - Background service
- Worker - Background service
- Sensor - Background service
- Notifier - Port 18081 (WebSocket)
Database Setup
# Create E2E test database
createdb attune_e2e
# Run migrations
export DATABASE_URL="postgresql://postgres:postgres@localhost:5432/attune_e2e"
sqlx migrate run
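Because the only thing separating E2E runs from the dev database is the URL, a small guard in the test helpers is cheap insurance. A sketch, assuming the database name from config.e2e.yaml:

```python
from urllib.parse import urlparse

# Guard to keep E2E runs from ever pointing at the dev database.
# The expected name matches config.e2e.yaml; adjust if the config changes.
E2E_DB_NAME = "attune_e2e"

def check_e2e_database_url(url: str) -> None:
    parsed = urlparse(url)
    assert parsed.scheme == "postgresql", f"unexpected scheme: {parsed.scheme!r}"
    db_name = parsed.path.lstrip("/")
    assert db_name == E2E_DB_NAME, (
        f"refusing to run E2E tests against {db_name!r}; expected {E2E_DB_NAME!r}"
    )
```

Calling this once at test-session startup fails fast before any destructive setup (truncating tables, seeding fixtures) can touch the wrong database.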
Running Tests (Manual)
Start All Services
Terminal 1 - API:
cd crates/api
ATTUNE__CONFIG_FILE=../../config.e2e.yaml cargo run
Terminal 2 - Executor:
cd crates/executor
ATTUNE__CONFIG_FILE=../../config.e2e.yaml cargo run
Terminal 3 - Worker:
cd crates/worker
ATTUNE__CONFIG_FILE=../../config.e2e.yaml cargo run
Terminal 4 - Sensor:
cd crates/sensor
ATTUNE__CONFIG_FILE=../../config.e2e.yaml cargo run
Terminal 5 - Notifier:
cd crates/notifier
ATTUNE__CONFIG_FILE=../../config.e2e.yaml cargo run
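With five services starting in separate terminals, tests need a way to block until each one is actually accepting connections. A readiness helper along these lines (ports per config.e2e.yaml: API=18080, WebSocket=18081) would work for both manual runs and later automation:

```python
import socket
import time

# Block until a service's TCP port accepts connections, or time out.
def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.25)
    return False
```

Usage: `wait_for_port("localhost", 18080)` before hitting the API, and the same for 18081 before opening a WebSocket. The Executor, Worker, and Sensor have no listening port, so their readiness would need a different signal (e.g. a log line or health row in the database).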
Next Steps
Phase 1: Infrastructure (Current - 80% Complete)
- Document test plan
- Create config.e2e.yaml
- Create test fixtures
- Set up directory structure
- Create test database and seed data
- Verify all services start with E2E config
- Create test helper utilities
Phase 2: Basic Tests (Next)
- Implement helper modules (api_client, service_manager)
- Write timer automation test
- Write workflow execution test
- Write FIFO ordering test
- Verify all basic scenarios pass
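A possible shape for the planned api_client helper, kept to the standard library. The endpoint path and auth header scheme here are assumptions to be aligned with the real API:

```python
import json
from urllib import request

# Sketch of the planned api_client helper. The /v1/executions path and
# Bearer-token auth are assumptions, not confirmed API details.
class ApiClient:
    def __init__(self, base_url="http://localhost:18080", token=None):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def _build_request(self, method, path, body=None):
        req = request.Request(
            f"{self.base_url}{path}",
            method=method,
            data=json.dumps(body).encode() if body is not None else None,
        )
        req.add_header("Content-Type", "application/json")
        if self.token:
            req.add_header("Authorization", f"Bearer {self.token}")
        return req

    def trigger_execution(self, action_ref, params):
        # Returns the prepared request; the real helper would send it
        # and poll for completion.
        return self._build_request(
            "POST", "/v1/executions",
            {"action": action_ref, "parameters": params},
        )
```

Separating request construction from sending keeps the helper testable without running services, which matters while the manual-startup workflow is still the norm.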
Phase 3: Advanced Tests
- Write secret management test
- Write inquiry flow test
- Write error handling test
- Write notification test
- Write dependency isolation test
Phase 4: Automation
- Create service start/stop scripts
- Create automated test runner
- Add CI/CD integration
- Add performance benchmarks
Technical Decisions
Why Separate E2E Config?
- Isolation: Separate database prevents test pollution
- Different Ports: Avoid conflicts with dev services
- Faster Polling: Reduce test duration
- Lower Security: Faster tests (bcrypt_cost=4)
- Minimal Logging: Cleaner test output
Why Test Fixtures?
- Consistency: Same pack used across all tests
- Simplicity: Echo action is trivial to verify
- Flexibility: Supports delay and failure modes
- Realistic: Real pack structure, not mocks
Why Manual Service Start First?
- Debugging: Easier to see service output
- Iteration: Faster test development cycle
- Validation: Verify config works before automation
- Later: Automate once tests are stable
Files Created
- tests/README.md - Test documentation (564 lines)
- config.e2e.yaml - E2E test configuration (204 lines)
- tests/fixtures/packs/test_pack/pack.yaml - Pack metadata (51 lines)
- tests/fixtures/packs/test_pack/actions/echo.yaml - Action spec (43 lines)
- tests/fixtures/packs/test_pack/actions/echo.py - Action implementation (87 lines)
- tests/fixtures/packs/test_pack/workflows/simple_workflow.yaml - Workflow (56 lines)
- work-summary/2026-01-17-e2e-test-setup.md - This document
Total: 7 files, ~1,000 lines of test infrastructure
Files Modified
- work-summary/TODO.md - Updated priorities and status
- work-summary/TODO.OLD.md - Moved old TODO for archival
Challenges & Solutions
Challenge: Multiple Services to Coordinate
Solution: Created clear documentation with step-by-step service startup instructions
Challenge: Test Isolation
Solution: Separate database, different ports, dedicated config file
Challenge: Fast Test Execution
Solution: Faster polling intervals, lower bcrypt cost, minimal logging
Challenge: Realistic Test Data
Solution: Created actual pack with real action structure (not mocks)
Success Criteria
For E2E tests to be considered complete:
- Test plan documented (8 scenarios)
- Test infrastructure created
- Test fixtures created (pack + action + workflow)
- E2E configuration file created
- All services start with E2E config
- Database seeded with test data
- Basic timer test passing
- Workflow test passing
- All 8 test scenarios passing
- No errors in service logs
- Clean shutdown of all services
Estimated Timeline
Phase 1: Setup (Current)
- Infrastructure setup: ✅ Complete
- Database setup: ⏳ 1 hour
- Service verification: ⏳ 2 hours
- Helper utilities: ⏳ 2 hours
- Total: ~5 hours remaining
Phase 2: Basic Tests
- Timer automation test: 2-3 hours
- Workflow execution test: 2-3 hours
- FIFO ordering test: 2-3 hours
- Total: 6-9 hours
Phase 3: Advanced Tests
- Secret management: 2-3 hours
- Inquiry flow: 2-3 hours
- Error handling: 2-3 hours
- Notifications: 2-3 hours
- Dependency isolation: 2-3 hours
- Total: 10-15 hours
Phase 4: Automation
- Scripts: 2-3 hours
- CI/CD: 3-4 hours
- Total: 5-7 hours
Grand Total: 26-36 hours (3-5 days)
Benefits Achieved
- Clear Test Strategy: 8 well-defined scenarios
- Comprehensive Documentation: 564 lines of test guide
- Realistic Fixtures: Actual pack structure for testing
- Isolated Environment: Won't interfere with development
- Fast Iteration: Faster polling for quicker test runs
- Production-Like: Tests full service integration
Lessons Learned
- Document First: Writing test plan revealed edge cases
- Realistic Fixtures: Better than mocks for integration tests
- Separate Config: Essential for test isolation
- Manual First: Easier to debug before automation
Next Session Goals
1. Create E2E Database
   - Create the attune_e2e database
   - Run migrations
   - Seed with test pack and user
2. Verify Service Startup
   - Start all 5 services with the E2E config
   - Verify database connections
   - Verify message queue connections
   - Check for any configuration issues
3. Implement First Test
   - Create test helper utilities
   - Write the timer automation test
   - Get the first green test passing
Status: 🔄 IN PROGRESS (Phase 1: ~80% complete)
Priority: P0 - BLOCKING
Confidence: HIGH - Clear path forward
Next Milestone: All services running with E2E config