ISTQB CTFL v4.0 Terms

Terms for the ISTQB CTFL v4.0 exam

1
New cards

Acceptance Criteria

The criteria that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity.

2
New cards

Acceptance Test-Driven Development

A collaboration-based test-first approach that defines acceptance tests in the stakeholders’ domain language.

3
New cards

Acceptance Testing

A test level that focuses on determining whether to accept the system.

5
New cards

Actual Result

The behavior produced or observed when a component or system is tested.

6
New cards

Agile Software Development

A group of software development methodologies based on iterative incremental development, where requirements and solutions evolve through collaboration between cross-functional teams.

7
New cards

Alpha Testing

A type of acceptance testing performed in the developer’s test environment by roles outside the development organization.

8
New cards

Anomaly

A condition that deviates from expectation.

9
New cards

API Testing

Testing performed by submitting requests to the test object using its application programming interface.
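
As a minimal sketch of what such a test can look like (the base URL, endpoint, and response fields below are hypothetical assumptions, not part of the glossary definition), a request is submitted through the API and the actual result is compared with the expected result:

```python
# Hypothetical API test: URL, endpoint, and response schema are illustrative only.
import requests

BASE_URL = "https://api.example.com"  # assumed test environment

def test_get_user_returns_expected_fields():
    # Submit a request to the test object via its API.
    response = requests.get(f"{BASE_URL}/users/42", timeout=5)
    # Compare the actual result against the expected result.
    assert response.status_code == 200
    body = response.json()
    assert body["id"] == 42
    assert "email" in body
```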

10
New cards

Audit

An independent examination of a work product or process performed by a third party to assess whether it complies with specifications, standards, contractual agreements, or other criteria.

11
New cards

Availability

The degree to which a component or system is operational and accessible when required for use.

12
New cards

Behavior-Driven Development

A collaborative approach to development in which the team focuses on delivering the expected behavior of a component or system for the customer, which forms the basis for testing.

13
New cards

Beta Testing

A type of acceptance testing performed at a site external to the developer’s test environment by roles outside the development organization.

14
New cards

Black-Box Test Technique

A test technique based on an analysis of the specification of a component or system.

15
New cards

Black-Box Testing

Testing based on analysis of the specification of the component or system.

16
New cards

Boundary Value

A minimum or maximum value of an ordered equivalence partition.

17
New cards

Boundary Value Analysis

A black-box test technique in which test cases are designed based on boundary values.
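
A small sketch of two-value boundary value analysis (the accept_age function and the 18–65 rule are hypothetical examples, not from the syllabus): each boundary of the valid partition and its nearest neighbour outside the partition becomes a test value.

```python
# Hypothetical example: the rule "ages 18 to 65 inclusive are accepted"
# defines the valid partition [18, 65]; 2-value BVA tests each boundary
# value plus its closest neighbour outside the partition.
import pytest

def accept_age(age: int) -> bool:
    # Illustrative implementation of the specified rule.
    return 18 <= age <= 65

@pytest.mark.parametrize("age, expected", [
    (17, False),  # just below the lower boundary
    (18, True),   # lower boundary value
    (65, True),   # upper boundary value
    (66, False),  # just above the upper boundary
])
def test_age_boundaries(age, expected):
    assert accept_age(age) == expected
```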

18
New cards

Branch

A transfer of control between two nodes in the control flow graph of a test item.

19
New cards

Branch Coverage

The coverage of branches in a control flow graph.
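
Branch coverage is reported as the number of exercised branches divided by the total number of branches, expressed as a percentage. A minimal sketch (grant_discount is a hypothetical example): the if statement below yields two branches, so running only the first test gives 50 % branch coverage, while both tests together give 100 %.

```python
# Hypothetical example: the if statement creates two branches in the
# control flow graph of grant_discount.
def grant_discount(age: int) -> float:
    if age >= 65:
        return 0.20   # True branch
    return 0.0        # False branch

def test_senior_discount():
    assert grant_discount(70) == 0.20   # exercises the True branch

def test_no_discount():
    assert grant_discount(30) == 0.0    # exercises the False branch
```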

20
New cards

Branch Testing

A white-box test technique in which the test conditions are branches.

21
New cards

Cause-Effect Diagram

A graphical representation used to organize and display the interrelationships of various possible root causes of a problem. Possible causes of a real or potential defect or failure are organized in categories and subcategories in a horizontal tree-structure, with the (potential) defect or failure as the root node.

22
New cards

Checklist-Based Review

A review technique guided by a list of questions or required attributes.

23
New cards

Checklist-Based Testing

An experience-based test technique in which test cases are designed to exercise the items of a checklist.

24
New cards

Coding Standard

A standard that describes the characteristics of a design or a design description of data or program components.

25
New cards

Collaboration-Based Test Approach

An approach to testing that focuses on defect avoidance through collaboration among stakeholders.

26
New cards

Compatibility

The degree to which a component or system can exchange information with other components or systems, and/or perform its required functions while sharing the same hardware or software environment.

27
New cards

Complexity

The degree to which the design or code of a component or system is difficult to understand.

28
New cards

Compliance

Adherence of a work product to standards, conventions or regulations in laws and similar prescriptions.

29
New cards

Component

A part of a system that can be tested in isolation.

30
New cards

Component Integration Testing

The integration testing of components.

31
New cards

Component Testing

A test level that focuses on individual hardware or software components.

32
New cards

Configuration Management

A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify that it complies with specified requirements.

33
New cards

Confirmation Testing

A type of change-related testing performed after fixing a defect to confirm that a failure caused by that defect does not reoccur.

34
New cards

Continuous Integration

An automated software development procedure that merges, integrates and tests all changes as soon as they are committed.

35
New cards

Continuous Testing

An approach that involves a process of testing early, testing often, testing everywhere, and automating, in order to obtain feedback on the business risks associated with a software release candidate as rapidly as possible.

36
New cards

Control Flow

The sequence in which operations are performed by a business process, component or system.

37
New cards

Cost of Quality

The total costs incurred on quality activities and issues, often split into prevention costs, appraisal costs, internal failure costs, and external failure costs.

38
New cards

Coverage

The degree to which specified coverage items are exercised by a test suite, expressed as a percentage.
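
In other words, coverage = (number of coverage items exercised / total number of coverage items) × 100 %; for example, a test suite that exercises 8 of 10 identified coverage items achieves 80 % coverage.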

39
New cards

Coverage Criteria

The criteria to define the coverage items required to reach a test objective.

40
New cards

Coverage Item

An attribute or combination of attributes derived from one or more test conditions by using a test technique.

41
New cards

Dashboard

A representation of dynamic measurements of operational performance for some organization or activity, using metrics represented via metaphors such as visual dials, counters, and other devices resembling those on the dashboard of an automobile, so that the effects of events or activities can be easily understood and related to operational goals.

42
New cards

Debugging

The process of finding, analyzing and removing the causes of failures in a component or system.

43
New cards

Decision Table Testing

A black-box test technique in which test cases are designed to exercise the combinations of conditions and the resulting actions shown in a decision table.
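
A small sketch using a made-up business rule (free shipping for members or for orders of 100 or more); each of the four rules in the decision table becomes one test case.

```python
# Hypothetical decision table for the made-up rule above:
#   Rule 1: member=yes, total>=100 -> free shipping
#   Rule 2: member=yes, total<100  -> free shipping
#   Rule 3: member=no,  total>=100 -> free shipping
#   Rule 4: member=no,  total<100  -> no free shipping
import pytest

def free_shipping(member: bool, order_total: float) -> bool:
    # Illustrative implementation of the rule described by the table.
    return member or order_total >= 100

@pytest.mark.parametrize("member, total, expected", [
    (True, 150.0, True),    # rule 1
    (True, 50.0, True),     # rule 2
    (False, 150.0, True),   # rule 3
    (False, 50.0, False),   # rule 4
])
def test_free_shipping_rules(member, total, expected):
    assert free_shipping(member, total) == expected
```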

44
New cards

Defect

An imperfection or deficiency in a work product where it does not meet its requirements or specifications.

45
New cards

Defect Density

The number of defects per unit size of a work product.
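
For example, 15 defects found in a component of 5,000 lines of code give a defect density of 3 defects per KLOC (thousand lines of code).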

46
New cards

Defect Detection Percentage

The number of defects found by a test level, divided by the number found by that test level and any other means afterwards.
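
Expressed as a formula: DDP = defects found by the test level / (defects found by the test level + defects found afterwards) × 100 %. For example, if system testing finds 80 defects and 20 further defects are found later, DDP = 80 / (80 + 20) = 80 %.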

47
New cards

Defect Management

The process of recognizing, recording, classifying, investigating, resolving and disposing of defects.

48
New cards

Defect Report

Documentation of the occurrence, nature, and status of a defect.

49
New cards

Driver

A component or tool that temporarily replaces another component and controls or calls a test item in isolation.
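
A minimal sketch of a driver (the parse_price component and its callers are hypothetical): a small stand-in caller that invokes the test item directly, in place of the code that would normally call it.

```python
# Hypothetical driver: exercises the component parse_price() in isolation,
# replacing the application code that would normally invoke it.
def parse_price(text: str) -> float:
    # Illustrative component under test.
    return float(text.strip().lstrip("$"))

if __name__ == "__main__":
    # The driver supplies the inputs and reports the actual results.
    for raw in ["$10.00", " 3.5 ", "$0"]:
        print(f"{raw!r} -> {parse_price(raw)}")
```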

50
New cards

Dynamic Testing

Testing that involves the execution of the test item.

51
New cards

Effectiveness

The extent to which correct and complete goals are achieved.

52
New cards

Efficiency

The degree to which resources are expended in relation to results achieved.

53
New cards

Entry Criteria

The set of conditions for officially starting a defined task.

54
New cards

Equivalence Partition

A subset of the value domain of a variable within a component or system in which all values are expected to be treated the same based on the specification.

55
New cards

Equivalence Partitioning

A black-box test technique in which test conditions are equivalence partitions exercised by one representative member of each partition.
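
A small sketch (the accept_quantity function and the 1–99 rule are hypothetical): the specification yields three equivalence partitions, and one representative member of each partition is chosen as a test value.

```python
# Hypothetical example: "order quantities 1 to 99 are accepted, anything else
# is rejected" yields three partitions: below 1 (invalid), 1..99 (valid),
# and above 99 (invalid).
import pytest

def accept_quantity(qty: int) -> bool:
    # Illustrative implementation of the specified rule.
    return 1 <= qty <= 99

@pytest.mark.parametrize("qty, expected", [
    (-5, False),   # representative of the partition below 1
    (50, True),    # representative of the valid partition 1..99
    (200, False),  # representative of the partition above 99
])
def test_quantity_partitions(qty, expected):
    assert accept_quantity(qty) == expected
```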

56
New cards

Error

A human action that produces an incorrect result.

57
New cards

Error Guessing

A test technique in which tests are derived on the basis of the tester's knowledge of past failures, or general knowledge of failure modes.

58
New cards

Exhaustive Testing

A test approach in which the test suite comprises all combinations of input values and preconditions.
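
This is rarely feasible in practice: for example, a single 32-bit integer input alone has 2^32 (more than four billion) possible values, before any combinations with other inputs or preconditions are considered.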

59
New cards

Exit Criteria

The set of conditions for officially completing a defined task.

60
New cards

Expected Result

The observable predicted behavior of a test item under specified conditions based on its test basis.

61
New cards

Experience-Based Test Technique

A test technique based on the tester's experience, knowledge and intuition.

62
New cards

Exploratory Testing

An approach to testing in which the testers dynamically design and execute tests based on their knowledge, exploration of the test item and the results of previous tests.

63
New cards

Failed

The status of a test result in which the actual result does not match the expected result.

64
New cards

Failure

An event in which a component or system does not perform a required function within specified limits.

65
New cards

Feature-Driven Development

An iterative and incremental software development process driven from a client-valued functionality (feature) perspective. This process is mostly used in Agile software development.

66
New cards

Finding

A result of an evaluation that identifies some important issue, problem, or opportunity.

67
New cards

Formal Review

A type of review that follows a defined process with a formally documented output.

68
New cards

Functional Appropriateness

The degree to which the functions facilitate the accomplishment of specified tasks and objectives.

69
New cards

Functional Completeness

The degree to which the set of functions covers all the specified tasks and user objectives.

70
New cards

Functional Correctness

The degree to which a component or system provides the correct results with the needed degree of precision.

71
New cards

Functional Testing

Testing performed to evaluate if a component or system satisfies functional requirements.

72
New cards

Heuristic

A generally recognized rule of thumb that helps to achieve a goal.

73
New cards

Impact Analysis

The identification of all work products affected by a change, including an estimate of the resources needed to accomplish the change.

74
New cards

Incremental Development Model

A type of software development lifecycle model in which the component or system is developed through a series of increments.

75
New cards

Independence of Testing

Separation of responsibilities, which encourages the accomplishment of objective testing.

76
New cards

Informal Review

A type of review that does not follow a defined process and has no formally documented output.

77
New cards

Inspection

A type of formal review that uses defined team roles and measurement to identify defects in a work product, and improve the review process and the software development process.

78
New cards

Integration Testing

A test level that focuses on interactions between components or systems.

79
New cards

Integrity

The degree to which a component or system allows only authorized access and modification to a component, a system or data.

80
New cards

Iterative Development Model

A type of software development lifecycle model in which the component or system is developed through a series of repeated cycles.

81
New cards

Maintainability

The degree to which a component or system can be modified by the intended maintainers.

82
New cards

Maintenance

The process of modifying a component or system after delivery to correct defects, improve quality characteristics, or adapt to a changed environment.

83
New cards

Maintenance Testing

Testing the changes to an operational system or the impact of a changed environment on an operational system.

84
New cards

Maturity

(1) The capability of an organization with respect to the effectiveness and efficiency of its processes and work practices.

(2) The degree to which a component or system meets needs for reliability under normal operation.

85
New cards

Mean Time To Failure

The average time from the start of operation to a failure for a component or system.
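
For example, if three comparable systems fail after 200, 300, and 400 hours of operation, the mean time to failure is (200 + 300 + 400) / 3 = 300 hours.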

86
New cards

Measurement

The process of assigning a number or category to an entity to describe an attribute of that entity.

87
New cards

Metric

A measurement scale and the method used for measurement.

88
New cards

Moderator

(1) The person responsible for running review meetings.

(2) The person who performs a usability test session.

89
New cards

N-Switch Coverage

The coverage of sequences of N+1 transitions.
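
For example, 0-switch coverage covers single transitions (sequences of one transition), while 1-switch coverage covers all valid sequences of two consecutive transitions in the state transition model.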

90
New cards

Negative Testing

Testing a component or system in a way for which it was not intended to be used.

91
New cards

Neuron Coverage

The coverage of activated neurons in the neural network for a set of tests.

92
New cards

Non-Functional Testing

Testing performed to evaluate that a component or system complies with non-functional requirements.

93
New cards

Operational Acceptance Testing

A type of acceptance testing performed to determine if operations and/or systems administration staff can accept a system.

94
New cards

Pair Testing

A test approach in which two team members simultaneously collaborate on testing a work product.

95
New cards

Passed

The status of a test result in which the actual result matches the expected result.

96
New cards

Path

A sequence of consecutive edges in a directed graph.

97
New cards

Performance Efficiency

The degree to which a component or system uses time, resources and capacity when accomplishing its designated functions.

98
New cards

Planning Poker

A consensus-based estimation technique, mostly used to estimate effort or relative size of user stories in Agile software development. It is a variation of the Wideband Delphi method using a deck of cards with values representing the units in which the team estimates.

99
New cards

Portability

The degree to which a component or system can be transferred from one hardware, software or other operational or usage environment to another.

100
New cards

Postcondition

The expected state of a test item and its environment at the end of test case execution.