Software Testing CA2


236 Terms

1. External Specifications: Define software requirements and expected behaviors.
2. Agile Development: Uses user stories for requirements instead of detailed specs.
3. Baseline Document: Describes expected system behavior for comparison.
4. Testing Oracle: Source used to determine the expected results for tests.
5. Expected Results: Outcomes defined before executing tests.
6. Exit Criteria: Triggers to confirm testing completion.
7. Coverage Achieved: Percentage of requirements tested; a common target is 80%.
8. Unit Testing: Testing individual components for correctness.
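As a minimal sketch of unit testing (Python's built-in `unittest` is assumed here, and `apply_discount` is a hypothetical component under test, not from the deck):

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

class ApplyDiscountTest(unittest.TestCase):
    """A unit test exercises one component in isolation."""

    def test_typical_discount(self):
        self.assertAlmostEqual(apply_discount(200.0, 25.0), 150.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150.0)

if __name__ == "__main__":
    unittest.main()
```

Each test checks one behavior of the component against an expected result, which is what makes failures easy to localize.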

9. Integration Testing: Testing combined components for interaction issues.
10. System Testing: Testing the complete system for compliance.
11. Test Planning: Process of defining testing strategy and scope.
12. Test Case Generation: Creating specific scenarios to validate requirements.
13. User Stories: Informal descriptions of software features from the user's perspective.
14. Refactoring: Improving code without changing its external behavior.
15. Component Testing: Testing individual parts of a system in isolation.
16. Testers' Role: Identify the tests needed and compare results with requirements.
17. Requirements Comparison: Assessing actual results against defined specifications.
18. Plausible Result: An erroneous result perceived as correct due to bias.
19. Critical Business Scenarios: Key functionalities that must be tested thoroughly.
20. Outstanding Incidents: Unresolved issues that may affect testing outcomes.
21. Test: A controlled exercise with defined inputs and outputs.
22. Expected Results: Derived from the baseline; compared with actual results.
23. Actual Result: The outcome obtained from executing a test.
24. Test Process: Main activities include planning, execution, and closure.
25. Test Planning: Determines strategy implementation and test scope.
26. Test Control: Measures results and monitors testing progress.
27. Test Analysis: Derives test conditions from baseline documents.
28. Test Design: Sets up the test environment and prepares test cases.
29. Test Implementation: Transforms test conditions into test cases and data.
30. Test Execution: Runs tests and records actual results.
31. Incident Reports: Documents test failures for analysis and action.
32. Regression Testing: Re-testing to ensure previous functionality remains intact.
33. Exit Criteria: Conditions to assess completion of testing activities.
34. Test Closure Activities: Final report and review after the testing phase.
35. Test Inventory: List of features and test cases to be executed.
36. Test Case Prioritization: Ranking test cases based on importance or risk.
37. Testware: Tools and resources used for testing activities.
38. Test Coverage: Extent to which testing addresses requirements.
39. Baseline Documents: Original documents used as a reference for testing.
40. Pre-flight Checks: Initial checks before executing test cases.
41. Test Scripts: Detailed instructions for executing test cases.
42. Test Data: Specific data used during test execution.
43. Corrective Actions: Steps taken to address identified issues.
44. Post-Implementation Reviews: Evaluations to learn from testing outcomes.
45. Defect Presence: Testing shows that defects exist, not that they are absent.
46. Exhaustive Testing: Testing all paths and inputs is impractical.
47. Effective Testing: Select tests that efficiently find faults.
48. Early Testing: Start testing early in the development cycle.
49. Defect Clustering: A few modules contain most of the pre-release defects.
50. Pesticide Paradox: Repeated tests stop finding new defects.
51. Context-Dependent Testing: Testing varies by software context and risk.
52. Absence-of-Errors Fallacy: Good tests don't guarantee user satisfaction.
53. Testing Goal: Find defects before customers discover them.
54. Test Understanding: Know the purpose behind each test.
55. Test Corrections: Fixes may introduce unintended side effects.
56. Defect Convoys: Defects often occur in clusters.
57. Defect Prevention: Testing includes preventing defects, not just finding them.

58. Automation in Testing: Well-planned automation enhances testing benefits.
59. Team Commitment: Successful testing requires dedicated, skilled teams.
60. Risk Focus: Prioritize testing based on risk assessment.
61. Testing Objectives: Focus testing on defined, clear objectives.
62. Operational Failures: Modules with defects often show operational issues.
63. Testing Efficiency: Balance effectiveness and efficiency in testing.
64. Faulty Requirements: Defects arise from specifications that do not meet user needs.
65. Test Design Techniques: Methods to identify test conditions and cases.
66. Equivalence Class Partitioning: Dividing inputs into groups treated equivalently.
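A minimal sketch of equivalence class partitioning (Python; the 18-65 "accepted age" rule is a hypothetical example, not from the deck):

```python
def classify_age(age: int) -> str:
    """Classify an age into one equivalence class.

    Hypothetical rule: ages 18-65 are accepted; anything else is rejected.
    """
    if age < 18:
        return "invalid: too young"   # EC1 - invalid partition below the range
    if age <= 65:
        return "valid"                # EC2 - valid partition, 18..65
    return "invalid: too old"         # EC3 - invalid partition above the range

# One representative value per partition is enough, because every
# member of a class is treated the same by the system under test.
representatives = {10: "invalid: too young", 40: "valid", 70: "invalid: too old"}
for value, expected in representatives.items():
    assert classify_age(value) == expected
```

Three test cases (one per class) give the same fault-finding power here as testing every age individually.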
67. Boundary Value Analysis: Testing at the edges of equivalence partitions.
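Boundary value analysis can be sketched as a small helper that generates the edge values for a closed range (Python; the 18-65 range is a hypothetical accepted partition):

```python
def boundary_values(lower: int, upper: int) -> list[int]:
    """Return the values BVA prescribes for a closed [lower, upper] range:
    just below, on, and just above each boundary."""
    return [lower - 1, lower, lower + 1, upper - 1, upper, upper + 1]

# For a hypothetical accepted range of 18-65:
print(boundary_values(18, 65))  # [17, 18, 19, 64, 65, 66]
```

Off-by-one defects cluster at these edges, which is why BVA concentrates test effort there rather than in the middle of a partition.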

68. Specification-Based Techniques: Testing methods based on software specifications.
69. Exhaustive Testing: Testing all possible inputs and paths, which is impractical.
70. Valid Partitions: Input ranges expected to be accepted by the system.
71. Invalid Partitions: Input ranges expected to be rejected by the system.
72. Test Case Strategy: Method to derive test cases from equivalence classes.
73. Equivalence Class: Group of inputs treated the same by the system.
74. Test Coverage: Extent to which test cases cover requirements.
75. Unique Identifier: Label assigned to each equivalence class.
76. Error Message Handling: The system's response to invalid input partitions.
77. Precision in Testing: Importance of exact values in specific domains.
78. Boundary Cases: Values just below, on, and just above a boundary.
79. BS 7925-2: British standard for software component testing that defines boundary value analysis techniques.
80. Equivalence Partitioning Example: Identifying valid and invalid input ranges.
81. Default Condition: Value assumed when no input is supplied.
82. Empty Condition: Value exists but contains no content.
83. Blank Condition: Value exists but its content is only whitespace.
84. Null Condition: Value does not exist or is unallocated.
85. Zero Condition: Numeric value of zero in input.
86. None Condition: No selection made from a list.
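The empty/blank/null/zero/none conditions above map naturally onto Python values; a sketch (with `None` standing in for the null condition, and the default condition left to the caller) might classify inputs as:

```python
def classify_input(value) -> str:
    """Map an input to one of the special conditions distinguished above.

    None plays the role of the null condition; the default condition
    (no input supplied at all) is the caller's concern, so it is not
    modelled here.
    """
    if value is None:
        return "null"            # value does not exist / unallocated
    if isinstance(value, str):
        if value == "":
            return "empty"       # exists but has no content
        if value.strip() == "":
            return "blank"       # content is only whitespace
    if value == 0:
        return "zero"            # numeric value of zero
    if isinstance(value, (list, tuple, set)) and len(value) == 0:
        return "none-selected"   # no selection made from a list
    return "ordinary"
```

Each of these conditions is a distinct equivalence class worth at least one test case, since systems routinely handle them differently.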

87. Fault Detection: Identifying errors through strategic test cases.
88. Subjective Boundary Selection: Boundary choices can vary based on interpretation.
89. Test Case Annotation: Documenting equivalence class identifiers for test cases.
90. Brute-Force Testing: Testing every combination of inputs, which is impractical.
91. Software Process: Methodology for producing software products.
92. Software Life-Cycle Model: Framework for managing software development phases.
93. Software Development Life Cycle (SDLC): The phases involved in software development.
94. Build-and-Fix Model: Ad-hoc approach without specifications or design.
95. Waterfall Model: Sequential phases where each phase is signed off.
96. Iterative Development Models: Repeated cycles of requirements, design, and testing.
97. Incremental Model: Gradually adds functionality to a system.
98. V-Model: Verification and validation model for software testing.
99. Continuous Software Engineering (CSE): Integration of continuous processes into software development.
100. Regression Testing: Testing existing functionality after changes.
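Regression testing can be sketched as re-running a recorded suite against previously passing baseline results (Python; the suite and baseline here are hypothetical):

```python
def run_regression(suite: dict, baseline: dict) -> list[str]:
    """Re-run every recorded check and report the ones whose actual
    result no longer matches the previously passing baseline."""
    failures = []
    for name, check in suite.items():
        actual = check()
        if actual != baseline[name]:
            failures.append(f"{name}: expected {baseline[name]!r}, got {actual!r}")
    return failures

# Hypothetical suite: after a code change, earlier results must still hold.
suite = {"add": lambda: 2 + 2, "upper": lambda: "ok".upper()}
baseline = {"add": 4, "upper": "OK"}
assert run_regression(suite, baseline) == []  # no regressions introduced
```

An empty failure list means the change preserved existing behavior; any entry is a regression to investigate before release.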