First of all, let us see what the term “Performance Testing” means:
In general engineering practice, “Performance Testing” refers to the evaluation & measurement of the functional characteristics of an individual, a system, a product or a material.
However, in software industry parlance, the term “Performance Testing” widely refers to the evaluation & measurement of how effectively a software system or component performs, as regards its reliability, scalability, efficiency, interoperability & stability under load.
These days a new discipline by the name “Performance Engineering” is emerging in the IT industry, & Performance Testing / Acceptance Testing are being viewed as its subsets. Performance engineering lays prime emphasis on covering the performance aspects in the system design itself, i.e. right from the beginning &, more importantly, well before the start of actual coding.
Why does the Software Industry lay so much emphasis on Performance Testing?
The key reasons are:
1) Performance has become a key indicator of product quality & acceptance in today’s highly dynamic & competitive market.
2) Customers are becoming extremely demanding on the quality front & have a clear vision of their performance objectives.
3) These days, every customer is looking for greater speed, scalability, reliability, efficiency & endurance from all applications – be they multi-tier, web-based or client-server applications.
4) There is a greater need for identifying & eliminating performance-inhibiting factors early in the development cycle. It is best to initiate performance testing efforts right from the beginning of the development project & keep them active till final deployment.
What are the objectives of Performance Testing?
1) To carry out root-cause analysis of common & uncommon performance-related problems & devise plans to tackle them.
2) To reduce the response time of the application with minimal investment in hardware.
3) To identify the problems causing the malfunctioning of the system & fix them well before the production run. Problems remedied during the later stages carry high cost tags.
4) To benchmark the applications, with a view to refining the company’s strategy towards software acquisition next time.
5) To ensure that the new system conforms to the specified performance criteria.
6) To draw a comparison between the performance of two or more systems.
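Objectives 5) & 6) above can be illustrated with a short sketch. The following is a hypothetical Python example, not taken from any real project: the response-time samples & the 2-second criterion are assumptions made purely for illustration.

```python
from statistics import mean, quantiles

# Hypothetical response-time samples (in seconds) recorded for two systems.
system_a = [0.8, 1.1, 0.9, 1.4, 1.0, 1.2, 0.9, 1.3]
system_b = [1.6, 2.1, 1.8, 2.4, 1.9, 2.2, 1.7, 2.0]

def summarize(samples):
    """Return mean and 95th-percentile response time for a sample set."""
    p95 = quantiles(samples, n=20)[-1]  # last of 19 cut points = 95th percentile
    return {"mean": mean(samples), "p95": p95}

CRITERION_S = 2.0  # assumed performance criterion: 95th percentile under 2 seconds

for name, samples in (("System A", system_a), ("System B", system_b)):
    stats = summarize(samples)
    verdict = "meets" if stats["p95"] <= CRITERION_S else "fails"
    print(f"{name}: mean={stats['mean']:.2f}s p95={stats['p95']:.2f}s -> {verdict} criterion")
```

Comparing the 95th percentile rather than the mean alone is a common design choice, since a handful of slow transactions can hide behind an acceptable average.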
Typical Structure of a Performance Testing Model:
Step-1: Collection of Requirements – the most important step & the backbone of the performance test model.
Step-2: System Study.
Step-3: Design of Testing Strategies – can include the following:
a) Preparation of traversal documents.
b) Scripting Work.
c) Setting up of test environment.
d) Deployment of monitors.
Step-4: Test Runs – can cover the following:
a) Baseline Test Run
b) Enhancement Test Run
c) Diagnostic Test Run
Step-5: Analysis & preparation of an interim report.
Step-6: Implementation of recommendations from step-5.
Step-7: Preparation of a Finalized Report.
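Steps 4a & 5 above can be sketched in a few lines of Python. This is a minimal illustration only: the `sample_transaction` function is an assumed stand-in for whatever transaction the real system under test performs.

```python
import time
from statistics import mean

def sample_transaction():
    """Stand-in for the transaction under test (an assumption for this sketch)."""
    time.sleep(0.01)  # simulate roughly 10 ms of work

def baseline_run(transaction, iterations=20):
    """Step-4a: execute the transaction repeatedly & record response times."""
    timings = []
    for _ in range(iterations):
        start = time.perf_counter()
        transaction()
        timings.append(time.perf_counter() - start)
    # Step-5: an interim report summarizing the baseline run.
    return {
        "iterations": iterations,
        "mean_s": mean(timings),
        "best_s": min(timings),
        "worst_s": max(timings),
    }

report = baseline_run(sample_transaction)
print(report)
```

The figures from such a baseline run become the reference point against which the later enhancement & diagnostic runs are compared.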
Attributes of a Good Performance Testing setup:
1) Availability of a performance baseline document detailing the present performance of the system & acting as an effective baseline for regression testing. This document can conveniently be used to compare results against expectations when the system conditions change.
2) Performance test beds & test environment should be separate & must replicate the live production environment as far as possible.
3) Performance testing environment should not be coupled with the development environment.
4) Resources leading to the fulfillment of objectives, like:
# Deployment of personnel with sound knowledge
# Systematic & deliberate planning
# Study of existing infrastructure
# Proper preparation
# Systematic execution
# Scientific analysis
# Effective reporting
However, these days many companies have started doing part of the testing in the live environment. This helps them establish the points of difference experienced between the test & live systems.
How to gear up for Performance Testing?
1) Define the performance conditions: First of all, we need to define the performance conditions related to functional requirements like speed, accuracy & consumption of resources. Resources can include memory, storage space & the bandwidth of the communication system.
2) Study the operational profile: The operational profile contains details of the usage patterns & environment of the live system. It includes a description of the period of operation, the operating environment, the quantum of loads & the expected transactions. When exact data is not available, it can be approximated from existing testing profiles, especially when testing is not being done in the live environment.
3) Prepare good performance test cases: While designing performance test cases, our endeavor must be to:
a) Understand the present performance levels & use this information for benchmarking at a later date.
b) Evaluate the performance requirements of the system against the specified norms.
c) Clearly specify the system inputs & the expected outputs when the system is subjected to the defined load conditions, covering the profile of the test, the test environment & the test duration.
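A performance test case of the kind described above can be captured as a small structured record. The sketch below is a hypothetical Python illustration; the field names, the 200-user load & the 2-second norm are all assumed values, not specifications from any real system.

```python
from dataclasses import dataclass

@dataclass
class PerformanceTestCase:
    """A performance test case: defined load, environment & duration,
    plus the specified response-time norm (all values assumed)."""
    name: str
    concurrent_users: int    # quantum of load
    duration_s: int          # test duration
    environment: str         # test environment description
    max_response_s: float    # specified performance norm

    def evaluate(self, observed_response_s: float) -> bool:
        """Compare an observed response time against the specified norm."""
        return observed_response_s <= self.max_response_s

case = PerformanceTestCase(
    name="login_under_peak_load",
    concurrent_users=200,
    duration_s=600,
    environment="replica of live production",
    max_response_s=2.0,
)
print(case.name, "passes" if case.evaluate(1.4) else "fails")
```

Writing the inputs, load profile & expected norm down in one place keeps the pass/fail decision objective instead of leaving it to the tester’s impression.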
Ways of doing Performance Testing:
Conventionally, there are two methods of performance testing:
1) Manual performance testing
2) Automated performance testing
1) Manual Performance Testing: In order to develop adequate confidence, the response times, being a good indicator of the performance of a transaction, must be measured several times during the test. Use of stopwatches monitored by several persons is one of the oldest & most effective ways to measure test performance. Depending upon the available infrastructure, other means can also be devised.
2) Automated Performance Testing: Many approaches can be practiced here. We can use automation software which simulates user actions & simultaneously records the response times & various system parameters like access to storage discs, usage of memory & queue lengths for various messages.
We can place additional data load on the system through utility programs, message replication programs, batch files & various protocol analyzing tools.
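The idea of simulating users & recording their response times can be sketched with plain Python threads. This is only an illustration under stated assumptions: the random sleep stands in for a real transaction, & the 10-user load is an arbitrary figure.

```python
import threading
import time
import random

response_times = []          # shared record of observed response times
lock = threading.Lock()      # guards the shared list across user threads

def simulated_user(actions=5):
    """Simulate one user performing several actions; the random sleep is a
    stand-in for a real transaction (an assumption for this sketch)."""
    for _ in range(actions):
        start = time.perf_counter()
        time.sleep(random.uniform(0.001, 0.005))  # pretend transaction
        elapsed = time.perf_counter() - start
        with lock:
            response_times.append(elapsed)

# Launch 10 simulated users concurrently & wait for them to finish.
threads = [threading.Thread(target=simulated_user) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"recorded {len(response_times)} response times, "
      f"worst = {max(response_times) * 1000:.1f} ms")
```

Commercial & open-source load tools work on the same principle, but drive the real application protocol instead of a sleep & scale to thousands of virtual users.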
Important Considerations for Designing Good Performance Test Cases:
1) Stress: To assess the ability of a system or its components to operate beyond the specified limits of the performance requirements.
2) Capacity: To establish the maximum amounts that can be contained or produced by, or that fully occupy, the entity.
3) Efficiency: To take care of the desired efficiency, measured as the ratio of the volume of data processed to the amount of resources consumed for that processing.
4) Response time: To take care of the specified response-time requirements, i.e. the total time elapsed between the initiation of a request & the receipt of the response.
5) Reliability: Must be able to deliver the expected results with sufficient consistency.
6) Bandwidth: Must be able to measure & evaluate the bandwidth requirements, i.e. the amount of data passing across the system per unit of time.
7) Security: Must be able to evaluate the user confidentiality, access permissions & data integrity considerations in the system.
8) Recovery: Must be able to subject the system under test to higher loads & measure the time it takes to return to normal after the loads are withdrawn.
9) Scalability: Must be able to handle more load through the addition of more hardware components, without any code change.
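Several of the considerations above (efficiency, bandwidth & recovery) reduce to simple ratios or measurements. The following sketch illustrates them in Python; every figure in it is an assumed example value, not data from any real system.

```python
def efficiency(bytes_processed: int, cpu_seconds: float) -> float:
    """Consideration 3): volume of data processed per unit of resource consumed."""
    return bytes_processed / cpu_seconds

def bandwidth(bytes_transferred: int, elapsed_s: float) -> float:
    """Consideration 6): amount of data passing across the system per second."""
    return bytes_transferred / elapsed_s

def recovery_time(timestamps_s, response_times_s, normal_s: float) -> float:
    """Consideration 8): time from load withdrawal (t=0) until the first
    observation back at or below the normal response time."""
    for t, r in zip(timestamps_s, response_times_s):
        if r <= normal_s:
            return t
    return float("inf")  # never recovered within the observation window

# Assumed observations taken after a stress load is withdrawn at t=0.
ts = [0, 5, 10, 15, 20]
rs = [4.0, 3.1, 2.2, 1.1, 0.9]
print("efficiency:", efficiency(10_000_000, 2.5), "bytes per cpu-second")
print("bandwidth :", bandwidth(50_000_000, 60), "bytes per second")
print("recovered after", recovery_time(ts, rs, normal_s=1.2), "seconds")
```

Expressing each consideration as a concrete measurement like this makes it possible to write a pass/fail check for it in the test report.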