Test Strategy And Approach For Online Backstage Management System

Purpose of the Document


The purpose of this document is to outline the test strategy and overall test approach for the Online Backstage Management System. This includes test methodologies, traceability, resources required, and estimated schedule.

This section describes the objectives and extent of the tests. The goal is to provide a framework that can be used by managers and testers to plan and execute the necessary tests in a timely and cost-effective manner.

OBMS: Online Backstage Management System

RSSS: Royal South Street Society

SADD: Software Architecture and Design Document

JSP: Java Server Pages

XML: Extensible Markup Language

AJAX: Asynchronous JavaScript and XML

HTML: Hypertext Markup Language

CSS: Cascading Style Sheets

The OBMS is a web-based application developed for the RSSS to automate the management of competitors and results. The society wishes to migrate from its paper system to the electronic online system to save on operational costs, risks, and time (Wohlin, Runeson, Höst, Ohlsson, Regnell, & Wesslén, 2012).

1.3.1 Software Architecture Overview (imported from the SADD)

[Figure: The OBMS 3-tier architecture and associated technologies]

[Figure: System components]

[Figure: Use case diagram (Login)]

Testing Process Approach

This section describes the general approach to the testing process. It discusses the reasons for the selected integration testing strategy. Different strategies are often needed to test different parts of the system (Pressman, 2015).

Unit testing and component testing will be performed on the components as they are being developed. Tests will be executed using test code, either as custom test tools or as an automated suite of tests run against the components in their individual sandboxes (Bourque & Fairley, 2014).
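As an illustration, the following is a minimal sketch of such a sandboxed unit test in JUnit. The RequiredFieldValidator component and its method names are hypothetical stand-ins, not the actual OBMS API; the real suite would target the components as implemented.

    import org.junit.Test;
    import static org.junit.Assert.*;

    // Hypothetical stand-in for an OBMS component; illustrative only.
    class RequiredFieldValidator {
        // Flags empty or missing values so incomplete fields can be highlighted.
        boolean isComplete(String value) {
            return value != null && !value.trim().isEmpty();
        }
    }

    public class RequiredFieldValidatorTest {
        @Test
        public void blankAndNullValuesAreFlaggedAsIncomplete() {
            RequiredFieldValidator validator = new RequiredFieldValidator();
            assertFalse(validator.isComplete(null));
            assertFalse(validator.isComplete("   "));
            assertTrue(validator.isComplete("Junior Vocal"));
        }
    }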

Integration tests will be performed by both the component testers and the system testers. The BAT and the unit test suite will be used as a regression suite during the integration of components. However, as the integration begins to include GUI-level functionality, the tests will rely significantly more on manual testing and less on automated testing.

Because the components will be developed both bottom-up and top-down, the test strategy will align with the order in which components are developed. Testing will follow a mostly bottom-up integration approach, supplemented by the sandwich integration approach, as sketched below.
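The sketch below illustrates the sandwich approach under stated assumptions: the test acts as the driver for a middle-layer service while a stub stands in for a lower layer that is not yet integrated. ResultService, ResultStore, and the stub are hypothetical names, not the actual OBMS components.

    import org.junit.Test;
    import static org.junit.Assert.*;
    import java.util.HashMap;
    import java.util.Map;

    // Lower-layer interface; a stub replaces the real persistence component for now.
    interface ResultStore {
        void save(String competitorId, int score);
        Integer load(String competitorId);
    }

    class InMemoryResultStoreStub implements ResultStore {
        private final Map<String, Integer> data = new HashMap<>();
        public void save(String competitorId, int score) { data.put(competitorId, score); }
        public Integer load(String competitorId) { return data.get(competitorId); }
    }

    // Middle-layer service under integration; the test below acts as its driver.
    class ResultService {
        private final ResultStore store;
        ResultService(ResultStore store) { this.store = store; }
        void record(String competitorId, int score) { store.save(competitorId, score); }
        Integer lookup(String competitorId) { return store.load(competitorId); }
    }

    public class ResultServiceIntegrationTest {
        @Test
        public void recordedResultCanBeReadBack() {
            ResultService service = new ResultService(new InMemoryResultStoreStub());
            service.record("C-101", 87);
            assertEquals(Integer.valueOf(87), service.lookup("C-101"));
        }
    }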

This section outlines the features of the OBMS that will be tested and those that will not. It also describes the tools that will be utilized and the environment in which the tests will be carried out.

Components developed in house

All components developed in house by this organization will be tested for the RSSS.

Off-the-shelf and third-party components

It is assumed that off-the-shelf and third-party components were evaluated, and their pros and cons properly weighed, before each was chosen for use with our software. The interfaces to those components will be tested, but not the functionality or performance of the components themselves.


The MySQL database management software is assumed to work as designed and will not be directly tested for functionality. However, performance tests involving the database will be carried out during system test with respect to GUI response time.

No direct tests will be carried out on the internet/Wi-Fi backbone either. It will only be utilized during testing of the system's components and functionality.

This section identifies the resources, which include hardware, software, special test tools, and other resources needed to support testing (Bourque, Fairley, 2014).

The team will need a lab for the testing exercise. A lab area of about 600 sq ft, with 200 sq ft of desktop space, will suffice for the testing procedure. The lab area needs multiple power outlets on each side of the room. A table in the center of the room would also facilitate the test team's technical discussions.

To enable the team to test in an optimal environment, the lab needs to contain 4 copies of the system under test. The hardware components of the system are a Database Server, a Web Server, three client PCs with a web browser that supports Java, and the embedded OBMS. The three client PCs allow the team to test several components in parallel.

The database (MySQL) will be installed, setup, and configured properly in the database server.

The Web Server machine will have the Apache web server installed and services started so that it can properly function as a Web Server.

The client machines need a compatible version of the JDK installed and properly configured, along with the Firefox web browser.
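A quick way to verify this environment before testing begins is a connectivity smoke check such as the sketch below. The host name, schema, and credentials are placeholders for the lab's actual values, and MySQL Connector/J must be on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class EnvironmentSmokeCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details; substitute the lab's actual host and credentials.
            String url = "jdbc:mysql://db-server:3306/obms";
            try (Connection conn = DriverManager.getConnection(url, "obms_test", "changeme")) {
                // isValid() pings the server with the given timeout in seconds.
                System.out.println("Database reachable: " + conn.isValid(5));
            }
        }
    }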

Additional tools and software may need to be purchased, otherwise acquired, or reused. Such software is used to execute special tests whose execution, recording, and analysis are best automated. Load testing requires a tool such as LoadRunner; security testing at the compiled source code level requires a tool such as FindBugs.
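Where a commercial tool is not yet available, a rough load smoke check can be scripted in plain Java, as sketched below. The endpoint URL and request counts are illustrative assumptions, and this is no substitute for a dedicated tool such as LoadRunner.

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class LoadSmokeCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint; point this at the system under test.
            final URL target = new URL("http://web-server/obms/login.jsp");
            ExecutorService pool = Executors.newFixedThreadPool(20);   // 20 concurrent users
            for (int i = 0; i < 200; i++) {                            // 200 requests in total
                pool.submit(() -> {
                    long start = System.nanoTime();
                    try {
                        HttpURLConnection conn = (HttpURLConnection) target.openConnection();
                        int code = conn.getResponseCode();
                        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                        System.out.println("HTTP " + code + " in " + elapsedMs + " ms");
                    } catch (Exception e) {
                        System.out.println("Request failed: " + e.getMessage());
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        }
    }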

This section is the main concern of the test plan. It outlines the test cases to be used during testing. Each test case is described in detail in its own Test Case Specification document (Vij, McClure, & Ekaireb, 2014). Every execution of these tests will be documented in a Test Incident Report document.

Test cases for functional requirements

Case 1 Import data from guide book


This function captures the guide book details from the office system into the OBMS database.

Guide book document in Word format

Expected Output and Pass/Fail criteria

All the information from the Word document should be entered into the OBMS database. There are no pass/fail criteria.

  • Selection of file to be imported
  • Retrieval and parsing of the file by the system
  • Displaying of any failed parse/entries by the system
  • Fixing of possible parse/entry failures
  • Storing of the information by the system as the current year’s guide book
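A sketch of an automated check for the parse-and-report steps above follows, under the assumption of a simple comma-separated line format; GuideBookParser and that format are hypothetical, not the real import component.

    import org.junit.Test;
    import static org.junit.Assert.*;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    // Hypothetical parser stand-in: expects "name,age,section" per line and
    // reports lines that fail to parse so the user can fix them.
    class GuideBookParser {
        List<String> parse(List<String> lines, List<String[]> parsedOut) {
            List<String> failures = new ArrayList<>();
            for (String line : lines) {
                String[] fields = line.split(",");
                if (fields.length == 3) {
                    parsedOut.add(fields);
                } else {
                    failures.add(line);
                }
            }
            return failures;
        }
    }

    public class GuideBookImportTest {
        @Test
        public void failedEntriesAreReportedForFixing() {
            List<String[]> parsed = new ArrayList<>();
            List<String> failures = new GuideBookParser().parse(
                    Arrays.asList("Jane Doe,12,Junior Vocal", "malformed-line"), parsed);
            assertEquals(1, parsed.size());
            assertEquals(Collections.singletonList("malformed-line"), failures);
        }
    }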
Case 2 Register competitor

This function is for competitors to register when they arrive on the competition day to let the administrators know they are there and divulge any necessary information for their event.

Competitor’s age, name, section, and other section-specific information

Expected Output and Pass/Fail criteria

The expected output is the competitor being listed as “Registered” and “Available to compete”. The pass/fail criterion is that, once registered, the system should indicate that the competitor is registered and available for competition.

  • User selects the section to perform the registration on
  • System retrieves and displays the list of competitors for that section
  • User selects competitor from list and marks them as registered
  • System prompts user for required information for this section
  • User enters information
  • System stores input information and marks competitor as “Registered”
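A minimal automated check for the final step above, assuming a hypothetical RegistrationDesk component that tracks registration status; the real OBMS component may differ.

    import org.junit.Test;
    import static org.junit.Assert.*;
    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical stand-in for the registration component; illustrative only.
    class RegistrationDesk {
        private final Set<String> registered = new HashSet<>();
        void register(String competitorId) { registered.add(competitorId); }
        boolean isRegistered(String competitorId) { return registered.contains(competitorId); }
    }

    public class RegisterCompetitorTest {
        @Test
        public void registeredCompetitorIsMarkedRegistered() {
            RegistrationDesk desk = new RegistrationDesk();
            desk.register("C-101");
            assertTrue(desk.isRegistered("C-101"));   // expected status: "Registered"
            assertFalse(desk.isRegistered("C-999"));  // unregistered competitors stay unmarked
        }
    }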
Case 3 Configurable screens

The system’s screens should be configurable so that each user can see the information they want to see.

The various types of information that must be displayed or collected

Expected Output and Pass/Fail criteria

A built screen displaying the user’s selection. There are no pass/fail criteria.

  • User goes to the admin panel, and chooses a role
  • A list of valid data that the role can view is produced
  • Fields which the user can and cannot see are selected
  • System stores the user’s selection.
Case 4 Recording of results

The adjudicators should be able to store the results of competitors who placed in the performances.

Competitor name, score, and place

Expected Output and Pass/Fail criteria

The competitors’ order of placement. There are no pass/fail criteria.

Test Procedure

  • Navigate to the section screen
  • A list of competitors’ numbers is displayed and an option to associate a result with each
  • User then associates a result with each competitor
  • The system stores the results
Case 5 Distribute information to key staff

This should enable the RSSS staff to eliminate paper processes and dynamically distribute information to key staff members for updates.

Any updates

Expected Output and Pass/Fail criteria

The updated information being displayed on the recipients’ devices. There are no pass/fail criteria.

  • User goes to particular section’s screen
  • A list of competitors and details for the section is displayed
  • User updates information
  • Updated information is stored by the system and distributed to stakeholders

Case 6 Print reports

The system must be able to print out information in hard copy as a backup in case of system failure.

Section information

Expected Output and Pass/Fail criteria

The printed report. There are no pass/fail criteria.

  • User selects the reports screen
  • A list of reports is displayed
  • User selects target report
  • System prompts for any information required for the report
  • User inputs required information
  • Report is displayed in a printable format
  • User prints report
Case 7 Group/school registration

The system should not only allow individuals to register, but also groups or schools.

Group/school name

Expected Output and Pass/Fail criteria

The expected output is the group/school registered. The pass/fail criteria are: the group/school must appear in the list, and incorrect or incomplete information and incomplete fields should be highlighted.

  • User enters the name of the group/school to search for
  • System searches competitors and displays a list of matches
  • User selects the right group/school from the list
  • Group/school information is displayed with a list of sections a competitor must register for.
Case 8 Move competitor between sections

The system should allow the chairman or manager to move a competitor to a different section before he/she registers.

The competitor’s name

Expected Output and Pass/Fail criteria

The competitor is moved to the new section and a confirmation message is displayed. There are no pass/fail criteria.

  • User navigates to the section the competitor is in and chooses to move them
  • A list of sections the competitor can be moved to is displayed
  • User selects the section to move the competitor to
  • System allocates a new number to the competitor based on what is available in that section
  • System moves competitor to new section
  • System displays a confirmation message that the movement is successful
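Step 4 above (allocating a new number based on availability in the target section) lends itself to an automated unit check, sketched below; SectionNumberAllocator is a hypothetical name and the per-section numbering scheme is an assumption.

    import org.junit.Test;
    import static org.junit.Assert.*;
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical allocator: hands out the next free competitor number per section.
    class SectionNumberAllocator {
        private final Map<String, Integer> nextNumber = new HashMap<>();
        int allocate(String section) {
            int n = nextNumber.getOrDefault(section, 1);
            nextNumber.put(section, n + 1);
            return n;
        }
    }

    public class MoveCompetitorTest {
        @Test
        public void movedCompetitorGetsNextFreeNumberInNewSection() {
            SectionNumberAllocator allocator = new SectionNumberAllocator();
            assertEquals(1, allocator.allocate("Senior Dance"));
            assertEquals(2, allocator.allocate("Senior Dance"));  // numbers are per section
            assertEquals(1, allocator.allocate("Junior Vocal"));
        }
    }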
Case 9 Register by section

The system should enable an RSSS staff member to bring up a section and display the list of competitors to be registered for a particular event.

The section where competitors will be registered

Expected Output and Pass/Fail criteria

The list of competitors who will be registered. The pass/fail criteria are: the competitor must appear in the list, and incorrect or incomplete information and incomplete fields should be highlighted.

  • User selects the target section
  • A list of competitors is displayed for that section
  • User selects a competitor from the list and marks them as registered
  • Prompt screen is displayed for the required information for that section
  • User enters the information
  • System stores the information and marks the competitor as “Registered”
Case 10 User login, admin

To enable the administrator to perform changes by logging in to the system.

Password

Expected Output and Pass/Fail criteria

The administrator panel. The pass/fail criterion is that the password must be correct.

  • User opens application
  • A list of roles available is displayed
  • User selects their role
  • Password screen is displayed
  • User enters password
  • Password is authenticated and admin screen displayed
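The pass/fail criterion (the password must be correct) can be verified with a small automated test, sketched below. Authenticator and the hard-coded credential are illustrative assumptions; a real implementation would store salted password hashes.

    import org.junit.Test;
    import static org.junit.Assert.*;
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical authenticator with an illustrative in-memory credential store.
    class Authenticator {
        private final Map<String, String> passwords = new HashMap<>();
        Authenticator() { passwords.put("admin", "s3cret"); }
        boolean authenticate(String role, String password) {
            return password != null && password.equals(passwords.get(role));
        }
    }

    public class AdminLoginTest {
        @Test
        public void correctPasswordPassesAndWrongPasswordFails() {
            Authenticator auth = new Authenticator();
            assertTrue(auth.authenticate("admin", "s3cret"));
            assertFalse(auth.authenticate("admin", "wrong"));
        }
    }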
Case 11 User login, not admin

This enables competitors to log in to the OBMS to view their customized screens according to their roles.

No inputs are required.

Expected Output and Pass/Fail criteria

The user’s customized screen. The pass/fail criterion is that if the user chooses the admin option, they will be prompted for a password.

  • User opens application
  • A list of available roles is displayed
  • User selects their role
  • A welcome screen is displayed
Case 12 Data backup/restore

This test covers the data backup function: capturing the guide book details held in the OBMS database onto backup media.

Guide book document in Word format

Expected Output and Pass/Fail criteria

All the information from the Word document should be written to the backup media. No pass/fail criteria exist.

  • User accesses the backup utility and chooses an output file
  • A backup file is automatically generated for download
  • The user places the file on backup media
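An automated check of the backup step could verify that the generated file contains every record, as in the sketch below; the backup utility is represented by a hypothetical file-dump method, and the record format is assumed.

    import org.junit.Test;
    import static org.junit.Assert.*;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Arrays;
    import java.util.List;

    public class BackupTest {
        // Hypothetical stand-in for the backup utility: dumps records to an output file.
        static void backup(List<String> records, Path out) throws IOException {
            Files.write(out, records);
        }

        @Test
        public void backupFileContainsAllRecords() throws IOException {
            Path out = Files.createTempFile("obms-backup", ".txt");
            backup(Arrays.asList("C-101,Jane Doe", "C-102,John Roe"), out);
            assertEquals(2, Files.readAllLines(out).size());
            Files.delete(out);   // clean up the temporary file
        }
    }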
Case 13 Restore data

The administrator should be able to perform a data restore in case data has been lost from the system.

Inputs

Input file.

Expected Output and Pass/Fail criteria

The restored data. No pass/fail criteria exist.

  • User accesses the backup utility and chooses an input file to restore
  • A restore function is automatically performed by the system

Case 14 Display competitor information

The system should enable an RSSS staff member to view information concerning a particular competitor during an event.

The competitor’s name or number

Expected Output and Pass/Fail criteria

The competitor’s information is displayed. The pass criterion is that the displayed information matches the competitor’s stored record; otherwise the test fails.

  • User navigates to the competitor search page and enters the competitor’s name
  • System searches and displays a list of possible matches
  • User selects the appropriate match
  • The system displays the competitor’s information.

References

Bourque, P., & Fairley, R. E. (2014). Guide to the software engineering body of knowledge (SWEBOK): Version 3.0. IEEE Computer Society Press.

De Lemos, R., Giese, H., Müller, H. A., Shaw, M., Andersson, J., Litoiu, M., … & Weyns, D. (2013). Software engineering for self-adaptive systems: A second research roadmap. In Software Engineering for Self-Adaptive Systems II (pp. 1-32). Springer Berlin Heidelberg.

Pressman, R. S. (2015). Software engineering: A practitioner's approach. McGraw-Hill International Edition.

Vij, R., McClure, D. K., & Ekaireb, M. (2014). U.S. Patent No. 8,676,529. Washington, DC: U.S. Patent and Trademark Office.

Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., & Wesslén, A. (2012). Experimentation in software engineering. Springer Science & Business Media.