Understanding Software Performance Testing Techniques

41 Terms

1

Key Performance Indicators (KPIs)

Metrics that help assess the performance of a system.

2

Availability

The amount of time an application is available to the end user.
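
For a sense of the arithmetic, availability is commonly reported as a percentage of scheduled time. A minimal Python sketch with illustrative numbers (a 30-day month and 43 minutes of downtime):

    # Availability = time actually available / scheduled time, as a percentage.
    scheduled_minutes = 30 * 24 * 60   # 43,200 minutes in a 30-day month
    downtime_minutes = 43              # illustrative downtime figure
    availability = (scheduled_minutes - downtime_minutes) / scheduled_minutes * 100
    print(f"{availability:.2f}% available")   # ~99.90%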

3

Throughput

The number of application-oriented events the software can process within a given period of time.
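
For example, with illustrative numbers: if the system handled 12,000 requests during a 10-minute test window, throughput is 20 requests per second.

    # Throughput = application-oriented events / elapsed time.
    requests_processed = 12_000
    window_seconds = 10 * 60
    throughput = requests_processed / window_seconds
    print(f"{throughput:.1f} requests/second")   # 20.0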

4

Capacity Utilization

The percentage of the capacity of a resource that is being used.

5

Baseline Testing

Establish a point of comparison for further tests by executing a single transaction as a single user.
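
A minimal baseline sketch in Python, assuming a hypothetical checkout URL: time one transaction as a single user and keep the result as the point of comparison for later tests.

    import time
    import urllib.request

    # One user, one transaction: the elapsed time becomes the baseline figure.
    start = time.perf_counter()
    with urllib.request.urlopen("https://example.com/checkout") as response:
        response.read()                            # complete the full transaction
    elapsed = time.perf_counter() - start
    print(f"Baseline response time: {elapsed:.3f}s")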

6

Load Testing

Understand the behavior of the system under a normal/expected load.
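
A minimal load-test sketch using Locust, an open-source load-testing tool; the home and checkout pages are hypothetical. Save it as loadtest.py, run "locust -f loadtest.py --host https://example.com", and ramp up to the expected number of concurrent users.

    from locust import HttpUser, task, between

    class ShopUser(HttpUser):
        # Simulated users pause 1-3 seconds between tasks, like real visitors.
        wait_time = between(1, 3)

        @task
        def browse_and_checkout(self):
            self.client.get("/")          # hypothetical home page
            self.client.get("/checkout")  # hypothetical checkout page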

7

Efficiency-oriented indicators

Metrics that focus on the efficient use of resources by the application.

8

Service-oriented indicators

Metrics that focus on the availability and performance of the service from the user's perspective.

9

Undue perceived delay

A delay experienced by the user that is considered excessive or frustrating.

10

Seamless manipulation

User interaction with the application that feels instantaneous, typically <= 0.1s.

11

Free navigation

User experience where interactions feel fluid and responsive, typically <= 1s.

12

User switch/interrupt operation

The point at which users may abandon an operation due to delays, typically <= 10s.
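
A small helper tying together the three perceived-delay thresholds above (0.1 s, 1 s, 10 s); the category names follow the cards, and the wording for the in-between band is illustrative.

    def perceived_delay(response_seconds: float) -> str:
        # Classify a response time against the classic perception thresholds.
        if response_seconds <= 0.1:
            return "seamless manipulation"
        if response_seconds <= 1:
            return "free navigation"
        if response_seconds <= 10:
            return "noticeable delay, user still waits"
        return "user likely switches or interrupts the operation"

    print(perceived_delay(0.4))   # free navigation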

13

Application-oriented events

Specific actions or transactions that the software processes, such as user requests.

14

Network bandwidth consumption

The amount of data transmitted over a network by application traffic.

15

Memory usage

The amount of memory consumed when a specific number of visitors are active.

16

Normal conditions

The standard operating environment under which baseline tests are conducted.

17

Maximum load

The highest level of demand the application can handle before failing or behaving unexpectedly.

18

Database capacity

The amount of data the database can handle before experiencing delays or crashes.

19

Network-related issues

Problems that affect the performance of the application due to network conditions.

20

Soak Testing

Identify performance problems that appear over extended periods of time by supplying expected load to the application continuously.

21

Stress Testing

See how the application performs under unfavorable conditions by overwhelming system resources, and ensure it fails and recovers gracefully.

22

Spike Testing

Make sure the system can handle bursts of higher user or system activity, focusing on high load for a short duration.
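
A rough spike-test shape for Locust, assuming a user class such as the ShopUser sketch under Load Testing above; tick() returns the target (user count, spawn rate) for the current moment, or None to end the test. The counts and timings here are illustrative.

    from locust import LoadTestShape

    class SpikeShape(LoadTestShape):
        def tick(self):
            run_time = self.get_run_time()
            if run_time < 60:
                return (10, 10)     # normal load for the first minute
            if run_time < 120:
                return (500, 100)   # sudden one-minute spike
            if run_time < 180:
                return (10, 10)     # back to normal to observe recovery
            return None             # stop the test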

23

Performance Testing Process

1. Decide on the testing environment
2. Identify performance metrics
3. Plan and design the performance tests
4. Configure the test environment
5. Implement the tests
6. Execute the tests
7. Analyze results, report, and retest

24

Testing Environment

What kind of environment can you afford? Options include a subset of the production system with lower-spec servers, a replica of production, or the actual production environment.

25

Latency

How long it takes to receive the first byte after a request is sent.
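
A rough measurement sketch, assuming a hypothetical URL: urllib's urlopen() returns once the response status and headers have arrived, so timing that call approximates the time to first byte (the body transfer, and therefore the full response time, comes afterwards).

    import time
    import urllib.request

    start = time.perf_counter()
    response = urllib.request.urlopen("https://example.com/")
    latency = time.perf_counter() - start     # approximate time to first byte
    response.read()                           # body transfer happens after this point
    print(f"Approximate latency: {latency * 1000:.0f} ms")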

26

Error Rate

Percentage of requests resulting in errors.
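
For example, with illustrative numbers: 150 failed requests out of 12,000 total gives an error rate of 1.25%.

    total_requests = 12_000
    failed_requests = 150
    error_rate = failed_requests / total_requests * 100
    print(f"Error rate: {error_rate:.2f}%")   # 1.25%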

27

Concurrent Users

The number of users active at any point in time; the most common measure of load.

28

CPU Utilization

The percentage of CPU resources being used.

29

Memory Utilization

The percentage of memory resources being used.
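
A quick utilization probe covering this card and the CPU Utilization card above, using the third-party psutil package (pip install psutil); sample it periodically while a test runs to track these efficiency-oriented KPIs.

    import psutil

    cpu_percent = psutil.cpu_percent(interval=1)   # CPU busy percentage over a 1 s sample
    memory = psutil.virtual_memory()               # system-wide memory statistics
    print(f"CPU utilization:    {cpu_percent:.1f}%")
    print(f"Memory utilization: {memory.percent:.1f}% of {memory.total / 2**30:.1f} GiB")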

30

Performance Test Checkpoints

Specific points of interest during a test, such as logging into a website, clicking checkout, processing payment information, and showing final confirmation.

31

Test Environment Configuration

Ideally, automate the creation and teardown of the testing environment to allow repeatability, using tools such as Chef, Ansible, Spinnaker, or Heroku.

32

Response Time

How long it takes for the system to respond to a request, measured in seconds.

33

Elapsed Time

The total time taken for a process to complete, measured in seconds.

34

Dynamic Analysis

Analyzing performance by observing the program as it runs, typically through instrumentation or profiling.

35

Profiling

Uses tools to instrument code and record performance information at runtime.
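
A minimal example using Python's built-in cProfile module, which instruments the code at runtime and records call counts and cumulative time per function; work() stands in for the code path under test.

    import cProfile
    import pstats

    def work():
        # Stand-in for the code path being profiled.
        return sum(i * i for i in range(100_000))

    profiler = cProfile.Profile()
    profiler.enable()
    work()
    profiler.disable()
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)   # top 5 functions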

36

Continuous Integration

The practice of automatically building and testing each code change; profiling can be integrated into this process, though it may slow down test execution.

37

Performance Code Review

Involves reviewing code for inefficient coding patterns and poor resource management.

38

Inefficient Coding Patterns

Examples include repeated calls to a database inside a loop.
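
A sketch of that pattern using sqlite3 and a hypothetical orders table: the first version issues one query per ID (one round trip per loop iteration), while the second fetches the whole batch in a single query.

    import sqlite3

    # Inefficient: one database round trip per order ID.
    def load_orders_slow(conn: sqlite3.Connection, order_ids: list) -> list:
        rows = []
        for order_id in order_ids:
            rows += conn.execute(
                "SELECT * FROM orders WHERE id = ?", (order_id,)
            ).fetchall()
        return rows

    # Better: a single batched query for all IDs.
    def load_orders_fast(conn: sqlite3.Connection, order_ids: list) -> list:
        placeholders = ",".join("?" * len(order_ids))
        return conn.execute(
            f"SELECT * FROM orders WHERE id IN ({placeholders})", order_ids
        ).fetchall()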

39

Parallelism Potential vs. Overhead

The overhead of splitting execution across threads or processes may outweigh the performance gains.

40

String Concatenation in a Loop

Inefficient pattern of concatenating strings using '+' in a loop.
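
A small comparison sketch: building a string with '+' inside a loop can copy the growing string on each iteration (quadratic in the worst case), whereas str.join builds the result in a single pass.

    import timeit

    def concat_plus(parts):
        result = ""
        for part in parts:
            result += part        # may copy the growing string each time
        return result

    def concat_join(parts):
        return "".join(parts)     # single pass over the parts

    parts = ["x"] * 100_000
    print("+ in a loop:", timeit.timeit(lambda: concat_plus(parts), number=10))
    print("str.join   :", timeit.timeit(lambda: concat_join(parts), number=10))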

41

Performance Testing Tools

Various tools are available to assist in performance testing, such as Apache JMeter, Gatling, Locust, and k6.