Quick Access:
  • Use test panels to validate step functionality
  • Review command execution logs and traces
  • Inspect output data from each command
  • Monitor validation statistics for errors
This guide covers debugging techniques, testing methods, and troubleshooting strategies to help you identify and resolve issues in your Bringits Stream projects.

Testing & Validation

Step Test Panel

Each step includes a test panel for validation.

Test Panel Features:
  • Input fields for required test parameters
  • Execute step with test data
  • View detailed test results
  • Identify configuration issues
When to Use:
  • After modifying step configuration
  • Troubleshooting validation errors
  • Verifying command changes
  • Testing with sample data
  1. Open Test Panel - Expand the test section within a step card.
  2. Enter Test Parameters - Fill in required input fields with test data.
  3. Run Test - Execute the step and review output.
  4. Analyze Results - Check test output for errors or unexpected results.

Testing Best Practices

Follow these practices when testing:
  • Use predefined variables - Simplify debugging with known, repeatable values
  • Track session IDs - Use #{UUID} to correlate multi-step flows
  • Monitor timeouts - Ensure commands complete within specified timeouts
  • Test incrementally - Verify each step independently before connecting them
The Bringits Stream ecosystem includes debugging tools and sandbox environments for testing projects before production deployment.
Always test your project in a development or sandbox environment before deploying to production.
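The #{UUID} session-tracking convention above can be sketched as a small interpolation helper. This is an illustrative stand-in, not the Bringits Stream implementation; the function and variable names are assumptions.

```python
import re
import uuid

def interpolate(template: str, variables: dict) -> str:
    """Replace #{NAME} placeholders; #{UUID} generates a fresh ID."""
    def resolve(match):
        name = match.group(1)
        if name == "UUID":
            return str(uuid.uuid4())
        if name not in variables:
            raise KeyError(f"Undefined variable: {name}")
        return variables[name]
    return re.sub(r"#\{(\w+)\}", resolve, template)

# Generate the session ID once, then reuse it across steps so a
# multi-step flow can be correlated in the logs.
session_id = interpolate("#{UUID}", {})
step_url = interpolate("https://api.example.com/items?session=#{SESSION}",
                       {"SESSION": session_id})
```

Reusing the resolved value, rather than expanding #{UUID} in every step, is what makes the ID useful for tracing a flow end to end.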

Viewing Output Data & Logs

Command Execution Logs

Monitor your project execution:
  • Logs - View command execution logs and traces to understand what’s happening at each step.
  • Data Preview - Inspect output from each command to verify data extraction is working correctly.
  • SINK Verification - Check message queue consumption to confirm data is being published successfully.

Output Validation

When validating output:
  • Verify data format matches expected structure
  • Confirm all required fields are present
  • Test with various input scenarios
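The format and required-field checks above can be expressed as a small validator. This is a minimal sketch; the field names in the sample schema are hypothetical.

```python
def validate_output(record: dict, required_fields: dict) -> list:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for field, expected_type in required_fields.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}")
    return problems

# Hypothetical expected structure for an extracted item.
schema = {"id": str, "price": float, "title": str}
validate_output({"id": "a1", "price": 9.99, "title": "Widget"}, schema)  # []
validate_output({"id": "a1", "price": "9.99"}, schema)  # two problems
```

Running the same validator against several input scenarios (empty records, wrong types, extra fields) covers the "various input scenarios" point above.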

Interpreting Statistics

Monitor validation statistics to identify issues.

Healthy Step Indicators:
  • High valid count relative to invalid
  • Consistent new item discovery
  • Low timeout occurrences
Warning Signs:
  • Increasing invalid count over time
  • Zero new items for extended periods
  • Frequent timeout errors
  • Sudden drops in valid count
High invalid counts may indicate:
  • Configuration errors in commands
  • Changes in source data structure
  • Network or connectivity issues
  • Validation rule mismatches
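The indicators above can be turned into an automated check. The counter names here are assumptions; adapt them to whatever fields your statistics view actually exports.

```python
def assess_step(stats: dict, min_valid_ratio: float = 0.9) -> list:
    """Flag the warning signs described above from a step's counters."""
    warnings = []
    total = stats["valid"] + stats["invalid"]
    # High invalid count relative to valid is the primary warning sign.
    if total and stats["valid"] / total < min_valid_ratio:
        warnings.append("high invalid ratio")
    # Zero new items for a reporting period suggests a stalled source.
    if stats["new_items"] == 0:
        warnings.append("zero new items")
    if stats["timeouts"] > 0:
        warnings.append("timeouts occurred")
    return warnings
```

Comparing snapshots over time (rather than a single reading) is what reveals the "increasing invalid count" and "sudden drop" patterns listed above.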

Common Issues & Troubleshooting

Command Failures

If a command fails:
  • Review command traces for navigation events
  • Check timeout settings if commands fail
  • Validate element selectors for browser commands

Variable Issues

If variables are not resolving:
  • Ensure secrets and variables are properly configured
  • Verify variable interpolation is working correctly
  • Check variable scope and availability
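A quick way to catch misconfigured variables before a run is to diff the placeholders a step references against the variables actually configured. This is an illustrative sketch assuming the #{NAME} placeholder syntax shown earlier.

```python
import re

def undefined_variables(template: str, configured: set) -> set:
    """Placeholders referenced in the template but not configured."""
    referenced = set(re.findall(r"#\{(\w+)\}", template))
    # #{UUID} is generated at runtime, so it never needs configuring.
    return referenced - configured - {"UUID"}

undefined_variables("#{API_KEY}/items?session=#{UUID}", {"TOKEN"})
# → the step references API_KEY, which is not configured
```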

Sink Issues

If data is not reaching your sink:
  • Verify exchange and routing key configuration
  • Check message queue connectivity
  • Confirm consumer is running and processing messages
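A consumer-side smoke test makes the last check concrete: pull a few messages and confirm they carry the expected fields. Python's stdlib `queue.Queue` stands in for the message broker here; with a real broker you would swap in your client's fetch call. All names are illustrative.

```python
import json
import queue

def verify_sink(broker_queue, expected_fields, sample_size=5):
    """Consume a few messages and confirm required fields are present."""
    problems = []
    for _ in range(sample_size):
        try:
            body = broker_queue.get_nowait()
        except queue.Empty:
            problems.append("queue drained early - is the sink publishing?")
            break
        message = json.loads(body)
        missing = [f for f in expected_fields if f not in message]
        if missing:
            problems.append(f"message missing fields: {missing}")
    return problems
```

An empty problem list means the sink is publishing well-formed messages and the consumer can read them.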

Validation Error Detection

Common Validation Issues:
  1. JSONPath Query Errors
    • Verify JSONPath syntax
    • Check source data structure
    • Test queries incrementally
  2. HTTP Request Failures
    • Verify URL format
    • Check authentication credentials
    • Validate request parameters
  3. Data Structure Mismatches
    • Review expected vs actual data format
    • Update parsing commands accordingly
    • Check for API changes
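"Test queries incrementally" means widening the path one segment at a time until it breaks, which pinpoints where the query diverges from the data. The evaluator below is a minimal debugging aid for simple `$.a.b[0].c` paths, not a full JSONPath implementation, and the sample document is hypothetical.

```python
import re

def jsonpath_get(data, path):
    """Minimal evaluator for simple $.name and [index] path segments."""
    if not path.startswith("$"):
        raise ValueError("path must start with '$'")
    current = data
    for name, index in re.findall(r"\.(\w+)|\[(\d+)\]", path):
        current = current[name] if name else current[int(index)]
    return current

doc = {"items": [{"title": "Widget", "price": 9.99}]}
# Widen the path one segment at a time:
jsonpath_get(doc, "$.items")             # the whole list
jsonpath_get(doc, "$.items[0]")          # the first item
jsonpath_get(doc, "$.items[0].title")    # "Widget"
```

Whichever segment first raises a KeyError or IndexError is where the query and the source structure disagree.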

Performance Debugging

Troubleshooting Performance Issues

High Invalid Count:
  1. Review validation statistics for patterns
  2. Use test panel to identify failing commands
  3. Check source data structure changes
  4. Verify JSONPath queries are correct
  5. Review command configurations
Zero New Items:
  1. Verify source data is updating
  2. Check for API changes or restrictions
  3. Review filtering logic
  4. Test with sample data
  5. Check network connectivity
Timeout Errors:
  1. Review timeout settings
  2. Check network performance
  3. Optimize command execution
  4. Consider increasing timeout values
  5. Review step complexity
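One way to act on the timeout advice above is to retry with a progressively larger timeout rather than failing on the first attempt. This is a generic sketch: `command` is any callable accepting a `timeout` keyword, and the parameter names are illustrative.

```python
import time

def run_with_timeout_retry(command, timeout: float,
                           retries: int = 3, backoff: float = 2.0):
    """Retry a command that raises TimeoutError, increasing the
    timeout on each attempt."""
    for attempt in range(retries):
        try:
            return command(timeout=timeout)
        except TimeoutError:
            if attempt == retries - 1:
                raise            # out of attempts - surface the error
            timeout *= backoff   # consider increasing timeout values
            time.sleep(0.1 * attempt)  # brief pause before retrying
```

If a step only succeeds at the largest timeout, that is a signal to raise the configured value or simplify the step rather than rely on retries.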

Performance Optimization Tips

Improve Valid Count:
  • Review and fix command configurations
  • Update JSONPath queries for current data structure
  • Verify HTTP request parameters
  • Check data validation rules
Reduce Invalid Count:
  • Identify patterns in invalid results
  • Update parsing logic for edge cases
  • Add error handling for missing data
  • Validate data structure before processing
Increase New Item Discovery:
  • Verify source data is updating
  • Check for API or website changes
  • Review filtering logic
  • Ensure proper data extraction paths

Debugging Checklist

Use this checklist when debugging your project:
  • Test each step independently using test panels
  • Review command execution logs and traces
  • Verify output data format matches expectations
  • Check validation statistics for error patterns
  • Validate variable interpolation is working
  • Test with sample data before production
  • Verify timeout settings are appropriate
  • Check network connectivity and API access
  • Review element selectors for browser commands
  • Confirm sink configuration and message consumption
  • Test in sandbox environment before deployment