The “Tick Ticket” is a new Internet site planned to meet the following requirements:
■ The site should be able to handle up to a maximum of 3000 hits per hour.
■ The average reaction time required for the maximal load of 3000 hits per hour is 10 seconds or less.
■ The average reaction time required for the regular load of 1200 hits per hour is 3 seconds or less.

The plan: The load tests were planned for the following series of hit frequencies (hits per hour): 300, 600, 900, 1200, 1500, 1800, 2100, 2400, 2700, 3000, 3300 and 3600. An initial hardware configuration was defined, to be adapted according to the load test results.

Implementation: Three series of load tests were run before an adequate hardware and communication software configuration was determined. After the first and second series of load tests, the hardware configuration was changed to increase the system’s capacity so as to achieve the required reaction times. The second configuration fulfilled the reaction time requirements for the average load but not for the maximal load. Therefore, capacity was further increased. In its final configuration, the software system could satisfactorily handle loads 20% higher than the originally specified maximal load. See Table 10.5 for the average reaction times measured at each round of load testing.
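A load-test procedure of the kind described above can be sketched as follows. This is a minimal illustration only: the function names, the callable `hit` standing in for a real HTTP request to the site, and the single-threaded pacing are all assumptions, not part of the original case.

```python
import time

# Planned hit frequencies from the test plan: 300 up to 3600 hits per hour,
# in steps of 300.
PLANNED_LOADS = list(range(300, 3601, 300))

def run_load_series(hit, hits_per_hour, n_hits):
    """Call `hit` n_hits times at the given frequency and return the
    average reaction time in seconds (single-threaded pacing)."""
    interval = 3600.0 / hits_per_hour      # seconds between consecutive hits
    reaction_times = []
    for _ in range(n_hits):
        start = time.monotonic()
        hit()                              # stand-in for one request to the site
        elapsed = time.monotonic() - start
        reaction_times.append(elapsed)
        if elapsed < interval:             # pace the next hit
            time.sleep(interval - elapsed)
    return sum(reaction_times) / len(reaction_times)

def meets_requirements(avg_by_load):
    """The two specified requirements: an average reaction time of 3 seconds
    or less at the regular load (1200 hits/hour) and 10 seconds or less at
    the maximal load (3000 hits/hour)."""
    return (avg_by_load.get(1200, float("inf")) <= 3.0
            and avg_by_load.get(3000, float("inf")) <= 10.0)
```

In this sketch, a hardware configuration would be accepted once `meets_requirements` holds for the averages measured over the planned load series.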
Table 10.5: Tick Ticket load tests – measured reaction times
[Columns: hit frequency (hits per hour); average reaction time (seconds) for Series I (hardware configuration I), Series II (hardware configuration II) and Series III (hardware configuration III).]
Test management
Testing involves many participants occupied in actually carrying out the tests and correcting the detected errors. In addition, testing typically monitors the performance of every item on long lists of test case files. This workload makes timetable follow-up important to management. Computerized test management supports these and other testing management goals. In general, computerized test management tools are planned to provide testers with reports, lists and other types of information at levels of quality and availability higher than those provided by manual test management systems.

Automated test management software packages provide features applicable to manual as well as automated testing, and features for automated tests only. The inputs the testers key in, together with the software package’s capabilities, determine the application’s scope. Especially important here is the package’s interoperability with respect to the automated testing tools.
Frame 10.6 provides a concise summary of the features offered by auto- mated test management software packages.
Frame 10.6 Automated test management packages – main features
(A = applicable to automated testing, M = applicable to manual testing)

A. Test plans, test results and correction follow-up
■ Preparation of lists, tables and visual presentations of test plans (A, M)
■ Listing of test cases (A, M)
■ Listing of detected errors (A, M)
■ Listing of correction schedules (performer, date of completion, etc.) (A, M)
■ Listing of uncompleted corrections for follow-up (A, M)
■ Error tracking: detection, correction and regression tests (A, M)
■ Summary reports of testing and error correction follow-up (A, M)

B. Test execution
■ Execution of automated software tests (A)
■ Automated listing of automated software test results (A)
■ Automated listing of detected errors (A)

C. Maintenance follow-up
■ Correction of errors reported by users (A, M)
■ Summary reports of maintenance correction services by customer, software system application, etc. (A, M)
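The error-tracking and follow-up features in section A of Frame 10.6 can be illustrated with a small sketch. The record fields and report layout below are illustrative assumptions, not the design of any particular package: each detected error carries its correction assignment and status, so the package can list uncompleted corrections and summarize follow-up.

```python
from dataclasses import dataclass

# Hypothetical error record for the tracking feature in Frame 10.6:
# detection, correction assignment, and regression-test status.
@dataclass
class ErrorRecord:
    error_id: int
    description: str
    performer: str = ""           # who is assigned the correction
    due_date: str = ""            # planned date of completion
    corrected: bool = False
    regression_passed: bool = False

def uncompleted_corrections(records):
    """Listing of uncompleted corrections for follow-up: an error is closed
    only once it is corrected and its regression tests have passed."""
    return [r for r in records if not (r.corrected and r.regression_passed)]

def summary_report(records):
    """Summary of testing and error correction follow-up (counts only)."""
    open_count = len(uncompleted_corrections(records))
    return {"total errors": len(records),
            "closed": len(records) - open_count,
            "open": open_count}
```

For example, an error that was corrected but has not yet passed its regression tests still appears in the follow-up listing, which is what makes the listing useful for timetable follow-up.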
The availability of automated testing tools