A
Accessibility: (Testing) Looks at an application to ensure that those with disabilities will be able to use it as effectively as an able-bodied individual.
Agile: A development method in which the contributing parties involved in creating the software, including testing, all work on the same item at the same time.
Alpha: (Testing) Testing of an application which is in production but not yet available for general use. This is normally performed by users internal to the business.
Analyst: Person who interacts with the business in order to understand and document their requirements for an application or change to an existing one.
Analyst (Test): Person responsible for the preparation and execution of test scripts and the recording and progressing of defects, reporting to the Test Team Leader or Test Manager.
Automation: The process of developing a script or program which can be run by software rather than a Test Engineer, to test an aspect of an application. Automation is used to increase the speed of test execution.
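As an illustration only, the following is a minimal automated check written with Python's built-in unittest module; the function under test (add_vat) and its expected behaviour are hypothetical and not taken from any real application.

# Minimal illustration of an automated test using Python's unittest module.
# The function under test (add_vat) is a hypothetical example.
import unittest


def add_vat(net_amount, rate=0.20):
    """Hypothetical function under test: add VAT to a net amount."""
    return round(net_amount * (1 + rate), 2)


class AddVatTests(unittest.TestCase):
    def test_standard_rate(self):
        # The expected result is checked by software rather than by a Test Engineer.
        self.assertEqual(add_vat(100.00), 120.00)

    def test_zero_amount(self):
        self.assertEqual(add_vat(0.00), 0.00)


if __name__ == "__main__":
    unittest.main()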
Automation Centre: A tool used in the automation of software testing. See:
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24%5E1074_4000_100__&jumpid=reg_R1002_USEN
B
Beta: (Testing) Testing of an application which is in production but not yet available for general use. This is normally performed by a select group of friendly external testers.
Black Box: The process of testing without understanding the internal workings of an application, relying on inputs and outputs only.
Bug: See Defect
Business Requirement Specification: See Requirement Specification
C
Case: See Scenario
Code: The software that has been produced by development and is being subjected to testing.
Completion Report: A document produced during testing and presented to the project manager on completion of the testing activity. The document should detail the tests that have been performed along with the associated results and defects. Some reports may include a recommendation on the application's suitability for release to production.
Configuration Management: The means of managing the components required to make a larger item. Applicable to files, documents, data, hardware, software and tools, it provides an understanding of the versions of each component required in order to be able to rebuild the larger item.
Criteria: See Entry Criteria and Exit Criteria
D
Data: Information pertaining to the use of a system or recorded through the use of a system. Data is used in order to test a system, potentially after data cleansing if personal information is involved.
Data Generator: A tool used to generate high volumes of data in order to be able to test many permutations, or to load test an item.
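A small sketch of the idea, assuming illustrative field names and value ranges; a real data generator tool would normally drive far higher volumes:

# Sketch of a data generator: produces many permutations of input values
# for testing. Field names and value ranges are illustrative assumptions.
import csv
import itertools
import random

TITLES = ["Mr", "Mrs", "Ms", "Dr"]
COUNTRIES = ["GB", "IE", "FR", "DE"]
ACCOUNT_TYPES = ["current", "savings"]


def generate_rows():
    """Yield one row per permutation of the illustrative fields."""
    for title, country, account in itertools.product(TITLES, COUNTRIES, ACCOUNT_TYPES):
        yield {
            "title": title,
            "country": country,
            "account_type": account,
            "balance": round(random.uniform(0, 10000), 2),
        }


with open("generated_test_data.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=["title", "country", "account_type", "balance"])
    writer.writeheader()
    writer.writerows(generate_rows())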
Data Scrambling: The process of altering personal information within data that is to be used for testing purposes.
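A minimal sketch of the idea, assuming a hypothetical record layout and masking rules:

# Sketch of data scrambling: personal fields in a record are replaced or
# masked before the data is used in a test environment. The record layout
# and masking rules are assumptions for illustration.
import hashlib


def scramble(record):
    """Return a copy of the record with personal information altered."""
    scrambled = dict(record)
    # Replace the name with a repeatable but anonymous token.
    scrambled["name"] = "TEST-" + hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    # Mask all but the last two digits of the phone number.
    scrambled["phone"] = "*" * (len(record["phone"]) - 2) + record["phone"][-2:]
    return scrambled


print(scramble({"name": "Jane Smith", "phone": "01234567890", "balance": 250.00}))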
Defect: Where the item under test has been found, as a result of testing, to be inaccurate. Primarily used in association with software, but equally valid for static testing of documentation.
Defect Detection: The means of identifying a defect. Can be a metric used to predict the volume of defects expected during the course of a project, or as a means of looking back at a project to understand where testing needs to be concentrated in future projects of a similar nature.
Defect Removal Efficiency: A metric used to assess the ability of testing to remove defects as they are introduced, during the software development life-cycle, keeping the cost of testing later phases to a minimum.
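A worked example of one commonly used formulation, defects found before release as a percentage of all defects found before and after release; the figures are illustrative only:

# Worked example of one common formulation of Defect Removal Efficiency.
def defect_removal_efficiency(found_before_release, found_after_release):
    total = found_before_release + found_after_release
    return 100.0 * found_before_release / total if total else 0.0


# 90 defects caught in testing, 10 escaped to production -> 90.0 percent DRE.
print(defect_removal_efficiency(90, 10))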
Defect Turnaround: The time taken from the identification of a defect, through to the point of resolution. Different levels of granularity may be used. e.g. A focus on the time taken by development.
Developer: A person responsible for the development of an application.
Development: The process of producing an application through the production of low level design and code, unit testing and integration in the small testing.
Dynamic: Testing which occurs on the right hand side of the V-model, with the application present.
E
Environment: The combination of hardware, software and data as used in development, testing and production. The platform/s upon which the testing occurs.
Entry Criteria: The criteria that must be met before a deliverable can enter the next phase of testing. This is normally associated with documented test assets and pre-agreed volumes of defects.
Error: See Defect
Exit Criteria: The criteria that must be met prior to a deliverable leaving the current phase of testing. This is normally associated with documented test assets and pre-agreed volumes of defects.
Execution: The process of working through a script on the application under test, in the testing environment.
F
Functional: (Testing) The testing of a product's function, against requirements and design.
Functional Specification: A document which extracts all of the functional requirements from the requirement specification.
G
Glass Box: See White Box
Grey Box: Testing performed by testers with some knowledge of the internal workings of an application. See also Black Box and White Box testing.
H
High Level Design: A design showing the major components and how these will interface to each other, defining the hardware to be used and the software that will be impacted.
I
Integration in the Large: Where the application or applications that have been developed are brought together along with those which have remained unchanged, building a production-like system around the application/s. Testing is then applied looking at the communication between the different applications.
Integration in the Small: Where the components of the application that have been developed are brought together along with those which have remained unchanged, building the application or major component of a single application. Testing is then applied looking at the communication between the different components.
Integration: The act of bringing many parts together to form a whole.
ISEB: Information Systems Examination Board. This was historically the board that was used to certify test professionals at either Foundation (Entry Level) or Practitioner Level (Advanced). See:
http://www.bcs.org/server.php?show=nav.5732
ISTQB: International Software Testing Qualifications Board. See:
http://www.istqb.org/
J
K
Key Performance Indicator: A mechanism built on one or more metrics which determines a band of acceptable performance, and which over time is often targeted towards improvement.
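A small sketch, assuming a hypothetical metric (defect turnaround in days) and hypothetical band limits:

# Sketch of a KPI built on a single metric: the metric value is compared
# against a band of acceptable performance. Metric and limits are assumed.
ACCEPTABLE_BAND = (0, 5)  # defect turnaround, in days


def kpi_status(turnaround_days):
    low, high = ACCEPTABLE_BAND
    return "within band" if low <= turnaround_days <= high else "outside band"


for days in (2, 4, 9):
    print(days, "days:", kpi_status(days))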
L
Load: One of the types of performance testing, this looks at testing for the maximum load on the system.
Load Runner: Tool used to performance test one or many applications, to understand how they handle increases in load. See:
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-15-17%5e8_4000_100__
Low Level Design: Definition of exactly how the application/s will be modified or produced in order to meet the requirements and the high level design. In some cases this can extend to elements of pseudo code being defined.
M
Metric: A measure of an attribute or occurrence in connection with the performance of an organisation, department or function.
N
Non-functional: Concerned with how an application performs, rather than what it does.
Non-functional Specification: A document which details the non-functional requirements such as performance, security, operational elements.
O
P
Performance Centre: A tool used for measuring the performance of an application or series of applications. See:
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-126-17_4000_100__
Performance: Used to describe many types of testing relating to the speed of an application. See Volume, Load and Stress.
Plan: A document produced for a type of testing defining how the testing will be performed, when the testing will be performed and who will be performing it. The test plan lists the test scenarios that will be scripted.
Preparation: The process of generating the test scripts.
Priority: The importance of fixing a defect from a business perspective. Defined by business representatives.
Q
Quality: The suitability of an item for its intended purpose and how well it achieves this.
Quality Centre: Tool used to assist with the management of testing, recording and tracking scripts, logging and tracking defects and more. See:
https://h10078.www1.hp.com/cda/hpms/display/main/hpms_content.jsp?zn=bto&cp=1-11-127-24_4000_100__
R
Regression: Retesting of a previously tested program following modification, to ensure that faults have not been introduced or uncovered as a result of the changes made.
Requirement Specification: A document normally produced by a business analyst, capturing the needs of the individual in a manner that allows them to be translated into a software solution.
Re-Test: Taking a test script that has previously failed and executing it again, normally once the associated defect has been resolved.
S
Scenario: A high level description of how a requirement is going to be tested.
Script: Referable back to the scenario, the script defines each step that must be passed through in order to perform a test, along with the expected results. As the script is executed, results are recorded; steps whose results match the expected result are marked as passed, otherwise as failed. A script containing a failure should have a resultant defect raised.
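A sketch of how a script's steps, expected results and recorded outcomes might be represented; the step details are illustrative only:

# Sketch of a test script's steps, expected results and recorded outcomes.
steps = [
    {"step": "Enter a valid username and password", "expected": "Home page displayed"},
    {"step": "Open the account summary", "expected": "Balance shown"},
]

# During execution the actual result of each step is recorded and compared
# with the expected result; any failure should lead to a defect being raised.
actuals = ["Home page displayed", "Error 500 displayed"]

for step, actual in zip(steps, actuals):
    status = "Passed" if actual == step["expected"] else "Failed"
    print(f"{step['step']}: {status}")
    if status == "Failed":
        print("  -> raise a defect recording the expected and actual results")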
Schedule: A document, similar to a project plan, but detailing the activities associated with testing, when they are due to occur and who will be performing them.
Severity: The importance of a defect to testing and the application. Defined by testers.
Smoke: A process of proving an application or environment is ready to commence full testing, by running a sample set of scripts testing the primary functionality and/or connectivity.
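A minimal sketch of the idea, assuming hypothetical host names for the environment's primary connectivity checks:

# Sketch of a smoke test: a small sample of checks covering primary
# connectivity, run before full testing commences. Hosts are assumptions.
import socket


def can_connect(host, port, timeout=2):
    """Basic connectivity check against the environment under test."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


smoke_checks = {
    "application server reachable": lambda: can_connect("test-env.example.com", 443),
    "database port open": lambda: can_connect("test-db.example.com", 5432),
}

for name, check in smoke_checks.items():
    print(name, "-", "OK" if check() else "FAILED")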
Static: The process of testing without the presence of software. Normally refers to the testing of documentation.
Strategy: A document produced at project or programme level, defining what testing is to be performed.
Stress: A form of performance testing whereby the load applied during testing is increased to a point at which the application is deemed to be failing to perform, either due to failure to meet performance criteria or due to system breakdown.
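A rough sketch of the idea, using a simulated operation and an assumed response-time criterion rather than a real application:

# Sketch of stress testing: load is stepped up until the response time
# breaches a performance criterion. Operation and threshold are assumed.
import time

RESPONSE_TIME_LIMIT = 0.5  # seconds, assumed performance criterion


def simulated_operation(concurrent_users):
    """Stand-in for the system under load; slows as load increases."""
    start = time.perf_counter()
    sum(i * i for i in range(concurrent_users * 10_000))
    return time.perf_counter() - start


for users in (10, 50, 100, 500, 1000):
    elapsed = simulated_operation(users)
    print(f"{users} users: {elapsed:.3f}s")
    if elapsed > RESPONSE_TIME_LIMIT:
        print("Performance criterion breached - stress point reached")
        break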
T
Technical Architecture Document: Definition of how the business requirements should be translated into a technical solution at the highest level.
Test Director: A tool used for test management, capture of requirements, scripts and defects, with the ability to manage defects through to resolution. Now replaced by Quality Centre.
Testing: The process of reviewing an item for accuracy and its ability to complete the desired objective.
U
Unit: The testing of the application by the developers at a component or modular level, looking at whether the code is performing as expected.
User Acceptance: The means of testing applied, normally by members of the business or recipients of the system, whilst still in the test environment. This looks to ensure that business requirements have been understood by all involved in the application production as well as providing an opportunity to check the business processes that the application will be used in conjunction with.
V
V-Model: A testing focussed software development life-cycle where the process of creativity occurs down the left side of the V and the process of testing up the right side of the V, where each step down the V has a corresponding point of testing up the right of the V. Begins with a requirement and terminates with User Acceptance Testing.
Version: An alphanumeric means of identifying a particular build of software, or a component thereof. The version changes incrementally to reflect each modification of the software, component, document etc.
Volume: (Testing) A type of performance testing which increases the volume of users on a system, in each cycle, taking the volume up to a prescribed limit, normally exceeding optimum load.
W
W3C: World Wide Web Consortium – body creating web standards and guidelines. See:
http://www.w3.org/
WAI: Web Accessibility Initiative. See:
http://www.w3.org/WAI/
Waterfall: Project management method whereby each element of the software development life-cycle occurs sequentially.
Win Runner: A tool that was historically used for test automation. See Automation Centre.
White Box: Testing normally performed by developers, looking at the code and ensuring that each module of code is performing as expected. See Unit.
X
Y
Z