Krishanu Konar

JMeter is used for performance testing

  • Use virtual users which concurrently perform requests/transactions on the website.
  • Use scripts to model these requests.
  • Can group multiple scripts to perform a single test.
  • All tests running together constitute the “workload”.

KPI (Key Performance Indicators)

  • Response Time: Time taken for the request to travel over the network, the processing time on the server, and the time taken for the response to travel back to the client.
  • Latency: Subset of response time; time taken until the first byte of the server's response is received by the client.
  • Render Time: Time taken by the browser to render the response once it has been received from the server. Not considered in JMeter, as it is client-dependent.
  • Max concurrent Virtual Users: Max no. of users the server can handle before crashing/being unresponsive.
  • Throughput: Capability of handling multiple requests/transactions in a period; Work done in unit time; Responses/sec (or per min.)
    • Light Zone: low throughput, increases with no. of concurrent users until saturation point
    • Heavy Zone: almost constant throughput with increasing VU
    • Buckle Zone: throughput starts decreasing with increasing VU.
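
As a quick illustration of these KPIs, here is a small Python sketch that derives average latency, average response time, and throughput from raw per-request timings. The `compute_kpis` helper and the sample numbers are made up for illustration, not JMeter output:

```python
# Hypothetical sketch: deriving the KPIs above from raw request timings.

def compute_kpis(samples):
    """Each sample: (first_byte_s, total_s), measured from request start."""
    n = len(samples)
    latencies = [s[0] for s in samples]       # time to first byte (latency)
    response_times = [s[1] for s in samples]  # full round trip (response time)
    wall_clock = sum(response_times)          # sequential worst case
    return {
        "avg_latency": sum(latencies) / n,
        "avg_response_time": sum(response_times) / n,
        "throughput_rps": n / wall_clock,     # responses per second
    }

samples = [(0.05, 0.20), (0.07, 0.25), (0.06, 0.30)]
kpis = compute_kpis(samples)
print(kpis["throughput_rps"])  # 3 requests / 0.75 s = 4.0
```

With concurrent VUs the wall-clock time would be shorter than the sum of response times, which is exactly why throughput rises with concurrency in the light zone.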

Load Test

  • Test the upper limit of the system (expected peak traffic, SLA fulfillment, etc.)
  • Fixed number of concurrent VUs and request iterations.
  • Ensures the application achieves the expected performance; determines operating capacity.
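
A load test of this shape can be sketched in a few lines of Python: a fixed number of virtual users (threads), each running a fixed number of iterations. `make_request` here is a hypothetical stand-in for a real HTTP call:

```python
# Minimal load-test sketch: fixed concurrent VUs, fixed iterations per VU.
import time
from concurrent.futures import ThreadPoolExecutor

VUS = 5          # fixed number of concurrent virtual users
ITERATIONS = 10  # requests per virtual user

def make_request():
    time.sleep(0.001)  # stand-in for a real request over the network
    return 200

def virtual_user():
    # Each VU performs its iterations sequentially, like a JMeter thread.
    return [make_request() for _ in range(ITERATIONS)]

start = time.time()
with ThreadPoolExecutor(max_workers=VUS) as pool:
    results = list(pool.map(lambda _: virtual_user(), range(VUS)))
elapsed = time.time() - start

total = sum(len(r) for r in results)
print(total, round(total / elapsed, 1))  # 50 requests; throughput in req/s
```

The measured throughput is what you would compare against the SLA to confirm the operating capacity.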

Stress Test

  • Load the system until failure and see how it deals with the failure.
  • Increase the no. of concurrent VUs or the data volume.
  • Continuous iterations.
  • Used for benchmarking and finding bottlenecks.
    • Spike Test: Sudden increase in number of VUs.
    • Soak Test: long period of testing, slow ramp-up.
  • Test durability, stability and recovery of the system.
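
The ramp-until-failure idea can be sketched against a toy model: a fake server whose response time degrades past a fixed capacity (both `fake_response_time` and the numbers are invented for illustration). VUs are stepped up until the SLA breaks, which is how a stress test locates the bottleneck:

```python
# Stress-test sketch: step up VUs until the SLA (response time) is violated.

def fake_response_time(concurrent_users, capacity=20):
    # Toy model: constant below capacity, degrades sharply past it.
    if concurrent_users <= capacity:
        return 0.2
    return 0.2 * (concurrent_users - capacity)

SLA = 1.0  # seconds
vus = 0
while True:
    vus += 5  # ramp the load in steps of 5 VUs
    if fake_response_time(vus) > SLA:
        break
print(vus)  # first load step where the SLA is violated (30 here)
```

A spike test would jump `vus` in one large step instead of ramping; a soak test would hold a step for a long time before moving on.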

Endurance Test

  • Fixed loads for longer duration.
  • Fixed number of concurrent VU.

Testing Process

  • Test script modeling: Identify performance requirements, the main user profiles and their business transactions; convert them into a test script.

  • Generate Test Data: Permutations/combinations of the inputs that can be provided to the server; run the scripts against all this data.

  • Environment Prep: System under test (web servers, backends, databases, etc.) and load generators (servers set up to drive load against the system).

  • Test Plan and Execution: Put together all the performance test scripts and assign VUs to these tests. Run the tests, make changes, and run them again until we get a consistent set of results.

  • Test Reporting: Status and final reports; must include the testing approach, the data and environment used, the scripts used, results, benchmarks and a summary.
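
The "Generate Test Data" step often boils down to enumerating input combinations. A small sketch with Python's `itertools.product` (the field names here are hypothetical examples, not anything JMeter mandates):

```python
# Sketch of test-data generation: every combination of a few input fields.
from itertools import product

browsers = ["chrome", "firefox"]
user_types = ["guest", "member", "admin"]
payload_sizes_kb = [1, 100]

test_data = [
    {"browser": b, "user_type": u, "payload_kb": p}
    for b, u, p in product(browsers, user_types, payload_sizes_kb)
]
print(len(test_data))  # 2 * 3 * 2 = 12 combinations
```

Each generated row would then be fed to the test script, e.g. via a CSV data set.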

JMeter GUI

  • ThreadGroup: Simulates VUs; sets iterations and ramp-up.
  • Sampler: Simulates the requests to send (e.g. HTTP, JDBC requests).
  • Logic Controllers: Organise samplers and logic.
  • Listeners: Display the results of the tests.
  • Timers: Set delays b/w requests.
  • Assertions: Test the responses from the server.
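
How these elements fit together can be mirrored in plain Python, purely as an analogy (JMeter itself configures them in the GUI or a .jmx file; `send_request` is a hypothetical stand-in sampler): VUs started over a ramp-up window, a constant timer between requests, and an assertion on each response.

```python
# Analogy for ThreadGroup (VUs, ramp-up, iterations), Timer, and Assertion.
import threading
import time

VUS, RAMP_UP, ITERATIONS, TIMER = 4, 0.2, 3, 0.01
results = []
lock = threading.Lock()

def send_request():
    return {"status": 200}  # stand-in for a Sampler (e.g. an HTTP request)

def virtual_user():
    for _ in range(ITERATIONS):
        resp = send_request()
        assert resp["status"] == 200  # Assertion: check the response
        with lock:
            results.append(resp)      # Listener would record/display this
        time.sleep(TIMER)             # Timer: constant delay b/w requests

threads = []
for _ in range(VUS):
    t = threading.Thread(target=virtual_user)
    t.start()
    threads.append(t)
    time.sleep(RAMP_UP / VUS)  # ramp-up: stagger VU start times
for t in threads:
    t.join()
print(len(results))  # 4 VUs * 3 iterations = 12 sampled responses
```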
