
What is Testing?

Thorough testing is essential before deploying voice agents to production. Hamsa provides multiple testing methods to validate conversation flows, debug issues, and ensure quality experiences for your callers.
Testing Methods Available:
  • Browser Testing - Quick testing with microphone in your browser
  • Phone Testing - Real phone call testing with actual voice quality
  • Call Simulation - Automated test scenarios with assertions
  • Live Call Monitoring - Real-time debugging during actual calls

Testing Methods

Browser Testing
  • Test directly in browser without making phone calls
  • View real-time transcripts and variable extraction
  • Monitor node transitions and execution logs
  • Perfect for rapid iteration during development
Phone Testing
  • Test with actual phone calls for production-ready validation
  • Verify voice quality and DTMF functionality
  • Experience real-world network conditions
  • Test from different locations and devices
Call Simulation
  • Create automated test scenarios
  • Run regression tests after changes
  • Validate specific conversation paths
  • Ensure consistent behavior
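An automated scenario pairs scripted user turns with assertions on the call outcome. The sketch below shows the shape of such a check; `run_simulation` is a local stub standing in for your platform's simulation API (the function and field names are illustrative, not Hamsa's actual SDK):

```python
# Minimal sketch of a call-simulation regression check. run_simulation is
# a stub; a real runner would drive the agent with the scripted user turns
# and capture transcripts, extracted variables, and node transitions.

def run_simulation(scenario: dict) -> dict:
    """Stub: pretend to run a scenario and return the call outcome."""
    return {
        "completed": True,
        "variables": {"appointment_date": "2024-06-01"},
        "final_node": "confirmation",
    }

def assert_scenario(scenario: dict, expectations: dict) -> None:
    result = run_simulation(scenario)
    assert result["completed"], "call did not reach completion"
    for key, expected in expectations.get("variables", {}).items():
        actual = result["variables"].get(key)
        assert actual == expected, f"{key}: expected {expected!r}, got {actual!r}"
    assert result["final_node"] == expectations["final_node"]

scenario = {"user_turns": ["Hi, I'd like to book for June 1st", "Yes, confirm that"]}
assert_scenario(scenario, {
    "variables": {"appointment_date": "2024-06-01"},
    "final_node": "confirmation",
})
print("scenario passed")
```

Because assertions cover variables and the final node, the same scenario doubles as a regression test after any flow change.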
Live Call Monitoring
  • Monitor active calls in real-time
  • View transcripts as they happen
  • Track variable extraction and tool calls
  • Identify and fix issues immediately

What to Test

Conversation Quality
  • Natural greetings and introductions
  • Accurate responses to common questions
  • Smooth conversation flow and transitions
  • Professional closings and next steps
Technical Functionality
  • Variable extraction accuracy
  • Tool integration and API calls
  • DTMF menu navigation
  • Webhook delivery
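Variable extraction is easiest to validate by comparing extracted values from test calls against hand-labelled expected values and tracking the accuracy over time. A minimal version of that comparison (field names here are made up for the example):

```python
# Illustrative variable-extraction accuracy check: each case records the
# field under test, what the agent extracted, and the labelled expectation.

def extraction_accuracy(cases: list) -> float:
    correct = sum(
        1 for case in cases
        if case["extracted"].get(case["field"]) == case["expected"]
    )
    return correct / len(cases)

cases = [
    {"field": "phone", "extracted": {"phone": "+15551234567"}, "expected": "+15551234567"},
    {"field": "zip",   "extracted": {"zip": "90210"},          "expected": "90210"},
    {"field": "email", "extracted": {"email": "jo@example"},   "expected": "jo@example.com"},
]
print(f"extraction accuracy: {extraction_accuracy(cases):.0%}")  # 2 of 3 cases match
```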
Voice & Audio
  • Voice clarity and naturalness
  • Appropriate speaking pace
  • Background noise handling
  • Interruption management
Error Handling
  • Recovery from invalid inputs
  • Timeout and silence detection
  • System error management
  • Graceful fallback responses
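The retry-then-fallback behavior described above can be sketched as a small decision function: re-prompt on silence or invalid input a bounded number of times, then fall back gracefully (here, a transfer). All names and thresholds are assumptions for illustration:

```python
# Sketch of graceful fallback for invalid input and silence/timeouts:
# re-prompt up to MAX_RETRIES times, then hand the call to a human.

from typing import Optional

MAX_RETRIES = 2

def handle_input(user_input: Optional[str], attempt: int) -> str:
    if user_input is None:                 # silence or timeout detected
        return "transfer_to_agent" if attempt >= MAX_RETRIES else "reprompt"
    if not user_input.strip().isdigit():   # e.g. expecting a DTMF digit
        return "transfer_to_agent" if attempt >= MAX_RETRIES else "reprompt_invalid"
    return "proceed"

assert handle_input(None, 0) == "reprompt"
assert handle_input(None, 2) == "transfer_to_agent"
assert handle_input("abc", 1) == "reprompt_invalid"
assert handle_input("3", 0) == "proceed"
```

Testing error handling means exercising every branch of logic like this, not just the "proceed" path.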

Common Test Scenarios

Happy Path Testing
  • Ideal conversation flows
  • All information provided correctly
  • Successful task completion
Error Handling Testing
  • Invalid inputs and edge cases
  • Missing or incorrect information
  • System failures and timeouts
  • Recovery and retry logic
Edge Case Testing
  • Multiple speakers on call
  • Heavy background noise
  • Poor network connections
  • Unexpected user responses

Getting Started

Choose your testing approach:
  • Start with Browser Testing for rapid iteration during development
  • Validate with Phone Testing for real voice quality and DTMF before launch
  • Add Call Simulation scenarios for automated regression coverage
  • Use Live Call Monitoring to debug issues during actual calls

Best Practices

Test Early, Test Often
  • Test after every significant change
  • Maintain a suite of regression tests
  • Test with real users before launch
  • Continue monitoring in production
Test Like a Real User
  • Use natural language, not scripts
  • Include unexpected inputs
  • Test unhappy paths and edge cases
  • Simulate various user personas
Browser vs Phone Testing
  • Use browser testing for rapid iteration
  • Always validate with phone testing before launch
  • Test from multiple locations and networks
  • Verify DTMF and voice quality on phone
Documentation and Tracking
  • Document test scenarios and expected results
  • Track issues found and their resolutions
  • Maintain test history for regression analysis
  • Create checklists for pre-launch validation
Automated Testing
  • Build regression test suites
  • Integrate testing into CI/CD pipelines
  • Set up monitoring and alerts
  • Generate automated test reports
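For CI/CD integration, the regression suite just needs to run every scenario, print a report, and signal failure through the exit code so the pipeline fails the build. A sketch with a stubbed scenario runner (all names hypothetical):

```python
# Sketch of wiring regression scenarios into a CI job: run each scenario,
# print a pass/fail report, and compute an exit code for the pipeline.

def run_scenario(name: str) -> bool:
    """Stub: a real runner would drive a simulated call and check assertions."""
    return name != "edge_case_noise"   # pretend one scenario fails

SCENARIOS = ["happy_path_booking", "invalid_phone_retry", "edge_case_noise"]

failures = [name for name in SCENARIOS if not run_scenario(name)]
for name in SCENARIOS:
    status = "FAIL" if name in failures else "PASS"
    print(f"{status}  {name}")

exit_code = 1 if failures else 0
# In an actual CI job, finish with: raise SystemExit(exit_code)
```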
Performance Testing
  • Test under expected load
  • Find breaking points with stress testing
  • Monitor response times and latency
  • Track error rates and success metrics
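A toy load-test harness illustrates the latency-tracking idea: run simulated calls concurrently and report percentiles. `simulate_call` is a placeholder (a fixed sleep) where a real test call would go:

```python
# Toy load test: run simulated calls concurrently and report p50/p95
# latency. Replace simulate_call with a real round trip to the agent.

import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def simulate_call(i: int) -> float:
    start = time.perf_counter()
    time.sleep(0.01)          # stand-in for one agent round trip
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=10) as pool:
    latencies = sorted(pool.map(simulate_call, range(50)))

p50 = statistics.median(latencies)
p95 = latencies[int(len(latencies) * 0.95)]
print(f"p50={p50 * 1000:.1f} ms  p95={p95 * 1000:.1f} ms")
```

Tracking p95 (not just the average) surfaces the tail latency that callers actually notice.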