Software Engineering 2: Software Testing


1
New cards

Testing

Intended to demonstrate that a program performs as designed and to identify program defects before it is deployed.

2
New cards

Testing

Involves executing a program using artificial data.

Results are analyzed for errors, anomalies, and non-functional attributes.

3
New cards

Testing

Its limitation is that it can reveal the presence of errors, but cannot guarantee their absence.

4
New cards

Role in Software Development

Part of the broader verification and validation (V&V) process.

Complements static validation techniques (e.g., code reviews, inspections).

5
New cards

Program Testing Goals

To demonstrate to the developer and the customer that the software meets its requirements.

To discover situations in which the behavior of the software is incorrect or undesirable, or does not conform to its specification.

6
New cards

Ensure functionality under normal use

The first goal leads to validation testing

7
New cards

Focus on defect detection

The second goal leads to defect testing

8
New cards

Validation Testing

To demonstrate to the developer and the system customer that the software meets its requirements.

A successful test shows that the system operates as intended.

9
New cards

Defect Testing

To discover faults or defects in the software where its behavior is incorrect or not in conformance with its specification.

A successful test is a test that makes the system perform incorrectly and so exposes a defect in the system.

10
New cards

System (The Box)

This is the program or software being tested. In a black-box view, its internal workings are not considered; only its external behavior matters.

11
New cards

Input test data

Data is sent into the system.

This corresponds to the inputs specified by the test cases.

12
New cards

Ie (Inputs Causing Anomalous Behavior)

The subset of the input test data that causes the system to behave anomalously and so can expose defects.

13
New cards

Output Test Results

Data is produced by the system after processing the input.

This is the actual behavior of the system; the subset Oe comprises the outputs that reveal the presence of defects.

14
New cards

Software Verification

The process of checking that the software meets its stated functional and non-functional requirements.

15
New cards

Software Validation

A more general process that aims to ensure that the software meets the customer’s expectations.

16
New cards

Verification and Validation Confidence

Its main goal is to build confidence that the system is fit for purpose.

17
New cards

Confidence Factors of Verification and Validation Confidence

  1. Software Purpose: Critical systems require higher assurance.

  2. User Expectations: Some software may be expected to have limited functionality.

  3. Marketing Environment: Speed to market may outweigh thorough defect detection.

18
New cards

Inspections

Can check conformance with a specification, but not conformance with the customer's real requirements.

19
New cards

Inspections

Cannot check non-functional characteristics such as performance, usability, etc.

20
New cards

Inspections and testing

Complementary and not opposing verification techniques.

Both should be used during the V & V process.

21
New cards

Software Inspections

A static validation technique where people examine system representations—like requirements, design, or test data.

22
New cards

TESTING Planning

Schedules and allocates resources for all testing activities.

23
New cards

TESTING Planning

Defines the testing process, considering available people and time.

24
New cards

TESTING Planning

Creates a test plan that outlines:

  1. What will be tested.

  2. The predicted testing schedule.

  3. How results will be recorded.

25
New cards

Test Case

A documented set of conditions, inputs, and expected results used to check whether a software feature works correctly.

26
New cards

Development Testing

Three Stages of Testing.

Discover bugs and defects during development.

27
New cards

Release Testing

Three Stages of Testing.

Ensure a complete version of the system meets the stakeholder requirements before release.

28
New cards

User Testing

Three Stages of Testing.

Its goal is to test the system in the user's own environment to decide if it meets their needs.

29
New cards

Development Testing

Includes all testing activities that are carried out by the team developing the system.

30
New cards

Unit Testing

Where individual program units or object classes are tested.

31
New cards

Unit Testing

The process of testing individual components in isolation.

32
New cards

Unit Testing

Its purpose is to be a defect testing process aimed at finding bugs early.
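As a minimal sketch of what unit testing looks like in practice (the function name and test cases here are hypothetical, not taken from the deck), each test exercises one component in isolation with known inputs and expected outputs:

```python
def classify_triangle(a, b, c):
    """Classify a triangle by its three side lengths."""
    if a + b <= c or b + c <= a or a + c <= b:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Each test exercises the unit in isolation, so a failure points
# directly at classify_triangle rather than at an integration.
def test_equilateral():
    assert classify_triangle(2, 2, 2) == "equilateral"

def test_isosceles():
    assert classify_triangle(2, 2, 3) == "isosceles"

def test_invalid():
    assert classify_triangle(1, 2, 10) == "invalid"

test_equilateral()
test_isosceles()
test_invalid()
```

In a real project these tests would typically be collected and run by a framework such as unittest or pytest rather than called by hand.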

33
New cards

Object Class Testing

Complete test coverage of a class involves:

  1. Testing all operations associated with an object.

  2. Setting and interrogating all object attributes.

  3. Exercising the object in all possible states.

34
New cards

Inheritance

Makes it more difficult to design object class tests as the information to be tested is not localized.

35
New cards

Partition Testing, Guideline-based Testing

Two Strategies in Choosing Test Cases.

36
New cards

Partition Testing

Where you identify groups of inputs that have common characteristics and should be processed in the same way.
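A small sketch of partition testing (the grading function and its bands are illustrative, not from the deck): the input range is split into equivalence partitions, and one representative per partition plus the boundary values is tested, since all members of a partition should be processed in the same way.

```python
def grade_band(score):
    """Map an integer 0-100 exam score to a band."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    if score >= 75:
        return "distinction"
    if score >= 50:
        return "pass"
    return "fail"

# Equivalence partitions for the input: <0, 0-49, 50-74, 75-100, >100.
assert grade_band(0) == "fail"          # lower boundary of 0-49
assert grade_band(49) == "fail"         # upper boundary of 0-49
assert grade_band(50) == "pass"         # lower boundary of 50-74
assert grade_band(75) == "distinction"  # lower boundary of 75-100
assert grade_band(100) == "distinction"
for invalid in (-1, 101):               # representatives of the invalid partitions
    try:
        grade_band(invalid)
        raise AssertionError("expected ValueError")
    except ValueError:
        pass
```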

37
New cards

Guideline-based Testing

Where you use testing guidelines to choose test cases. These guidelines reflect previous experience of the kinds of errors that programmers often make when developing components.

38
New cards

Testing Guidelines: Sequences

  1. Test software with sequences that have only a single value.

  2. Use sequences of different sizes in different tests.

  3. Derive tests so that the first, middle, and last elements of the sequence are accessed.

  4. Test with sequences of zero length.
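The four guidelines above can be sketched against a hypothetical sequence-processing function (the function itself is illustrative, not part of the deck):

```python
def span(seq):
    """Return (min, max) of a sequence, or None if it is empty."""
    if not seq:
        return None
    return (min(seq), max(seq))

# Guideline 1: a sequence with only a single value.
assert span([7]) == (7, 7)
# Guidelines 2-3: different sizes, with the interesting elements
# placed at the first, middle, and last positions.
assert span([9, 2, 5]) == (2, 9)        # max first
assert span([1, 8, 3, 4]) == (1, 8)     # min first, max in the middle
assert span([5, 2, 9]) == (2, 9)        # max last
# Guideline 4: a zero-length sequence.
assert span([]) is None
```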

39
New cards

Component Testing

Where several individual units are integrated to create composite components.

40
New cards

Component Testing

Focus: Testing composite components, which are made up of several interacting objects.

41
New cards

Component Testing

Access: Functionality is accessed through a defined component interface.

42
New cards

Component Testing

Goal: To demonstrate that the component interface behaves according to its specification.

43
New cards

Component Testing

Prerequisite: Assumes unit tests on the individual objects within the component have already been completed successfully.

44
New cards

Interface testing

Its goal is to detect faults resulting from interface errors or invalid assumptions about interfaces.

45
New cards

Parameter Interfaces

Interface Types.

Data is passed directly between methods or procedures.

46
New cards

Shared Memory Interfaces

Interface Types.

A block of memory is directly shared between procedures or functions.

47
New cards

Procedural Interfaces

Interface Types.

A sub-system offers a set of procedures that other sub-systems call.

48
New cards

Message Passing Interfaces

Interface Types.

Sub-systems communicate by sending messages to request services from each other.

49
New cards

Interface Errors

Invalid assumptions about how interfaces work.

50
New cards

Interface Misuse

Common Types of Interface Errors.

The calling component makes a mistake when using the interface of the called component (e.g., passing parameters in the wrong order).
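The wrong-order mistake can be sketched as follows (the function names and parameters are hypothetical): the swapped call still type-checks, so the defect is silent, which is exactly why interface testing targets it.

```python
def schedule_retry(delay_ms, retries):
    """Hypothetical called component: a delay in ms, then a retry count."""
    return {"delay_ms": delay_ms, "retries": retries}

# Interface misuse: the caller swaps the parameters. The call still
# runs (both arguments are ints), but the behaviour is silently wrong.
misused = schedule_retry(3, 500)        # intended: 500 ms delay, 3 retries
assert misused["delay_ms"] == 3         # not what the caller meant

# One mitigation: keyword-only parameters make ordering mistakes
# impossible at the call site.
def schedule_retry_safe(*, delay_ms, retries):
    return {"delay_ms": delay_ms, "retries": retries}

correct = schedule_retry_safe(delay_ms=500, retries=3)
assert correct == {"delay_ms": 500, "retries": 3}
```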

51
New cards

Interface Misunderstanding

Common Types of Interface Errors.

The calling component holds incorrect assumptions about the behavior or functionality of the called component.

52
New cards

Timing Errors

Common Types of Interface Errors.

Occur when the calling and called components operate at different speeds, leading to the use of out-of-date information.

53
New cards

System Testing

Where some or all of the components in a system are integrated and the system is tested as a whole.

54
New cards

System Testing

Focuses on verifying the integrated system by assembling components to create a working version.

55
New cards

System Testing

Its primary goal is to check the interactions between components, ensuring they are compatible, behave correctly, and transfer the appropriate data across their interfaces at the right time.

56
New cards

Basis for Testing

Use Cases for System Testing.

Use cases provide a structured foundation for system testing by identifying system interactions.

57
New cards

Forcing Interactions

Use Cases for System Testing.

Testing each use case naturally involves and forces interactions among several system components.

58
New cards

Documentation

Use Cases for System Testing.

Sequence diagrams that are part of the use case documentation clearly map out the specific components and interactions that are being tested.

59
New cards

Testing Policies

Developed to define the required system test coverage.

60
New cards

Testing Policies

These policies ensure critical areas are tested without checking every possibility.

61
New cards

Menu Access

Examples of common testing policies.

All system functions accessible via menus must be tested.

62
New cards

Function Combinations

Examples of common testing policies.

Combinations of functions (like text formatting) accessed via the same menu must be tested together.

63
New cards

Input Validation

Examples of common testing policies.

Where user input is involved, all functions must be tested with both correct and incorrect input.

64
New cards

Test-Driven Development

An incremental approach where testing and coding are tightly inter-woven.

65
New cards

Test-Driven Development

Its core principle is tests are written before the code.

66
New cards

Test-Driven Development

Its process includes developing code in small increments, writing a new test for each increment.

67
New cards

Test-Driven Development

Its critical driver for moving forward is ensuring the newly written code successfully 'passes' its test.

68
New cards

Identify Increment

TDD: Process Activities.

Determine a small piece of required functionality (an increment) that can be implemented quickly.

69
New cards

Write Failing Test

TDD: Process Activities.

Write and implement an automated test specifically for this new functionality. When run with existing tests, this new test must initially fail because the functionality hasn't been written yet.

70
New cards

Implement Functionality

TDD: Process Activities.

Write the necessary code to make the new test pass.

71
New cards

Re-run All Tests

TDD: Process Activities.

Execute the entire test suite (the new test and all previously implemented tests) to ensure everything works (regression testing).

72
New cards

Next Increment

TDD: Process Activities.

Once all tests run successfully, you proceed to the next small chunk of functionality, repeating the cycle.
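The five activities above can be sketched in one tiny cycle (the slug-making increment is a hypothetical example, not from the deck):

```python
# Step 1: pick a small increment -- here, slugifying a page title.
# Step 2: write its test first; running the suite at this point fails
# with a NameError, because make_slug does not exist yet.
def test_make_slug():
    assert make_slug("Hello World") == "hello-world"

# Step 3: write just enough code to make the test pass.
def make_slug(title):
    return title.lower().replace(" ", "-")

# Step 4: re-run the whole suite (here, a single test) -- this is the
# regression-testing step. Step 5: once everything passes, move on to
# the next increment and repeat the cycle.
test_make_slug()
```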

73
New cards

Code Coverage

Benefits of Test-Driven Development.

Every code segment that you write has at least one associated test, so all code is covered by tests.

74
New cards

Regression Testing

Benefits of Test-Driven Development.

A regression test suite is built up incrementally as the program is developed, so all tests can be re-run cheaply whenever the code changes.

75
New cards

Simplified Debugging

Benefits of Test-Driven Development.

When a test fails, it should be obvious where the problem lies. The newly written code needs to be checked and modified.

76
New cards

System Documentation

Benefits of Test-Driven Development.

The tests themselves are a form of documentation that describe what the code should be doing.
