Performance Journey
A methodical approach to performance testing, walking through requirements gathering, test asset creation, execution, and reporting for new APIs.
Mark
Performance Testing Expert
This article presents a practical walkthrough of performance testing methodology using a fictitious scenario where ten new APIs are added to an existing web application.
The Scenario
The company needs to integrate ten new customer database APIs while maintaining existing SLAs, with only one week until launch. The requirements:
- Support 500 concurrent users
- 100 transactions per second baseline
- New APIs called twice per user journey
- Maintain existing SLAs
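These figures imply a total throughput target. One reading that makes the numbers line up (an assumption, since the scenario does not spell it out) is that the 100 TPS baseline counts complete user journeys, and the new APIs add two extra calls per journey on top of it:

```python
# Back-of-the-envelope throughput maths for the scenario above.
# Assumption: the 100 TPS baseline represents existing journey
# transactions, and the new APIs contribute two additional calls
# per journey on top of that baseline.

BASELINE_TPS = 100            # existing transactions per second
NEW_API_CALLS_PER_JOURNEY = 2

new_api_tps = BASELINE_TPS * NEW_API_CALLS_PER_JOURNEY
total_tps = BASELINE_TPS + new_api_tps

print(f"New API load: {new_api_tps} TPS")
print(f"Total target: {total_tps} TPS")
```

This arithmetic yields the 300 TPS total that the requirements-gathering phase arrives at.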
Phase 1: Gap Analysis
Before diving in, identify what’s missing:
- Missing non-functional requirements
- Insufficient test data
- Undocumented volumetrics
- Unclear success criteria
Phase 2: Requirements Gathering
Leverage existing SLAs to define targets:
| Metric | Target |
|---|---|
| Page response time | 5 seconds |
| 99th percentile | 4 seconds |
| Individual API response | Sub-second |
| Total throughput | 300 TPS |
Each new API should respond in sub-second timeframes. Combined infrastructure must handle 300 total transactions per second.
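Checking a percentile target against measured response times is a one-liner with the standard library. The sample values below are hypothetical, purely to illustrate the check against the 4-second 99th-percentile target:

```python
import statistics

# Hypothetical page response times in seconds (not real measurements).
samples = [0.8, 1.2, 0.9, 3.5, 1.1, 2.4, 0.7, 3.9, 1.0, 1.6] * 50

# statistics.quantiles with n=100 returns the 1st..99th percentiles;
# index 98 is the 99th percentile.
p99 = statistics.quantiles(samples, n=100)[98]

print(f"99th percentile: {p99:.2f}s")
print("PASS" if p99 <= 4.0 else "FAIL")
```

The same pattern applies to the 5-second page target and the sub-second per-API target, just with different thresholds.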
Phase 3: Test Asset Creation
Develop JMeter scripts for:
- Isolated API testing - Test each API independently
- Combined API testing - Test all APIs together
- Integrated testing - Test APIs within the full user journey
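The three script types above can be thought of as a scenario matrix. A minimal sketch (scenario names, API labels, and thread counts here are hypothetical, not taken from any actual JMeter plan):

```python
# Illustrative test-scenario matrix for the ten new APIs.
NEW_APIS = [f"api_{i:02d}" for i in range(1, 11)]

scenarios = {
    # One run per API, executed one at a time.
    "isolated": [{"apis": [api], "threads": 50} for api in NEW_APIS],
    # All ten new APIs exercised together in a single run.
    "combined": [{"apis": NEW_APIS, "threads": 500}],
    # New APIs embedded in the full existing user journey.
    "integrated": [{"apis": ["login", *NEW_APIS, "checkout"],
                    "threads": 500}],
}

for name, runs in scenarios.items():
    print(name, "->", len(runs), "run(s)")
```

Treating scenarios as data like this makes it easy to generate or validate the corresponding JMeter thread groups programmatically.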
Test Data Requirements
For different test durations:
| Duration | Data Rows Required |
|---|---|
| 1 hour | 720,000 |
| 8 hours | 5,760,000 |
Plan your test data generation accordingly.
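The row counts in the table correspond to consuming 200 rows per second, which matches the 200 TPS of new-API traffic if each transaction consumes one unique row (an assumption; the CSV field names below are also hypothetical). A sketch of the sizing calculation plus a tiny sample file for JMeter's CSV Data Set Config:

```python
import csv
import io

def rows_required(tps: int, duration_hours: float) -> int:
    """Unique data rows needed if every transaction consumes one row."""
    return int(tps * duration_hours * 3600)

# Matches the table above at 200 rows consumed per second.
print(rows_required(200, 1))   # 720000
print(rows_required(200, 8))   # 5760000

# Generate a small sample of unique customer rows as CSV.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["customer_id", "email"])
for i in range(1, 6):
    writer.writerow([f"CUST{i:07d}", f"user{i}@example.com"])
print(buf.getvalue())
```

Scaling the loop bound to the `rows_required` figure gives a data file large enough that no row is reused during the run.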
Phase 4: Test Execution
The test strategy includes:
- Baseline tests - Establish performance benchmarks
- Load tests (isolated) - Test individual APIs under load
- Load tests (combined) - Test all APIs together
- Soak tests - 8-hour duration to identify memory leaks
- Stress tests - Find breaking points
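For the stress tests, a stepped load profile is a common way to find the breaking point: hold each load level long enough to stabilise, then step up. A minimal sketch (step size, hold duration, and ceiling are hypothetical):

```python
def stress_steps(start_tps: int, step: int, ceiling: int, hold_s: int = 300):
    """Yield (target_tps, hold_seconds) pairs stepping up to a ceiling."""
    tps = start_tps
    while tps <= ceiling:
        yield tps, hold_s
        tps += step

# Step from the 300 TPS target up to double that, in 100 TPS increments.
plan = list(stress_steps(300, 100, 600))
print(plan)  # [(300, 300), (400, 300), (500, 300), (600, 300)]
```

Each step's response times and error rates tell you how much headroom exists above the 300 TPS target before degradation begins.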
Phase 5: Reporting
Use comprehensive analysis tools:
- Grafana - Real-time metrics visualization
- JMeter HTML reports - Detailed response time analysis
- Hardware monitoring - CPU, memory, disk I/O
- Application logs - Error analysis and debugging
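JMeter's CSV results format (`.jtl`) is plain enough to post-process directly when the built-in HTML report isn't sufficient. A sketch using a tiny inline sample with only the relevant columns:

```python
import csv
import io
import statistics

# Minimal sample in JMeter's CSV results (.jtl) format; only the
# columns used below are shown.
jtl = """timeStamp,elapsed,label,responseCode,success
1700000000000,850,GET /api/customer,200,true
1700000001000,920,GET /api/customer,200,true
1700000002000,1430,GET /api/customer,500,false
1700000003000,760,GET /api/customer,200,true
"""

rows = list(csv.DictReader(io.StringIO(jtl)))
elapsed = [int(r["elapsed"]) for r in rows]    # response times in ms
errors = sum(r["success"] == "false" for r in rows)

print(f"samples:    {len(rows)}")
print(f"avg (ms):   {statistics.mean(elapsed):.0f}")
print(f"error rate: {errors / len(rows):.1%}")
```

Swapping the inline string for `open("results.jtl")` turns this into a quick sanity check you can run between formal reports.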
Communicate findings throughout testing, not just at the end.
Key Takeaway
Understand the requirements. Then design a test scenario that will stimulate the application sufficiently to analyse its behaviour when placed under load.