Chapter 6: CPU Scheduling

Description and Tags

These flashcards cover key terms and concepts from Chapter 6 on CPU Scheduling in Operating Systems.

Last updated 5:01 AM on 10/19/25

20 Terms

1

CPU Scheduling

The process of determining which process in the ready queue will be allocated CPU time.

2

Throughput

The number of processes that complete their execution per time unit.

3

Turnaround Time

The total time from the submission of a process to its completion.

4

Waiting Time

The total time a process spends waiting in the ready queue.

5

Response Time

The time from when a request was submitted until the first response is produced.
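
The four metrics above are easiest to see on a concrete trace. A minimal Python sketch, using a made-up single-CPU schedule (all process names and times below are hypothetical):

    # Hypothetical trace: P1 runs 0-5, P2 runs 5-8, P3 runs 8-12 (arbitrary time units).
    trace = {
        "P1": {"arrival": 0, "burst": 5, "first_run": 0, "completion": 5},
        "P2": {"arrival": 1, "burst": 3, "first_run": 5, "completion": 8},
        "P3": {"arrival": 2, "burst": 4, "first_run": 8, "completion": 12},
    }

    total_time = max(p["completion"] for p in trace.values())  # elapsed time of the whole schedule
    throughput = len(trace) / total_time                       # 3 / 12 = 0.25 processes per time unit

    for name, p in trace.items():
        turnaround = p["completion"] - p["arrival"]   # submission to completion
        waiting = turnaround - p["burst"]             # time spent only in the ready queue
        response = p["first_run"] - p["arrival"]      # submission until the CPU is first allocated
        print(name, turnaround, waiting, response)

    print("throughput:", throughput)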

6

Preemptive Scheduling

A scheduling strategy whereby a currently running process can be interrupted and moved to the ready state.

7

Non-preemptive Scheduling

A scheduling strategy where a running process cannot be interrupted until it finishes its current CPU burst.

8

First-Come, First-Served (FCFS)

A scheduling algorithm that serves processes in the order they arrive in the ready queue.
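
A minimal non-preemptive FCFS sketch in Python; the (name, arrival, burst) tuple format and the sample workload are hypothetical, chosen to show how one long job ahead of short ones inflates waiting time (the convoy effect):

    def fcfs(processes):
        """Serve (name, arrival, burst) tuples in arrival order; return per-process waiting time."""
        clock = 0
        waiting = {}
        for name, arrival, burst in sorted(processes, key=lambda p: p[1]):
            clock = max(clock, arrival)      # CPU may sit idle until the process arrives
            waiting[name] = clock - arrival  # time spent in the ready queue
            clock += burst                   # run the entire CPU burst to completion
        return waiting

    print(fcfs([("P1", 0, 24), ("P2", 0, 3), ("P3", 0, 3)]))
    # -> {'P1': 0, 'P2': 24, 'P3': 27}: the long first job delays everything behind it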

9

Shortest Job First (SJF)

A scheduling algorithm that selects processes based on the length of their next CPU burst.
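
A minimal non-preemptive SJF sketch in Python. It assumes the length of each next CPU burst is known up front, which a real scheduler can only estimate (for example, by exponentially averaging previous bursts); the tuple format and workload are hypothetical:

    def sjf(processes):
        """Non-preemptive shortest-job-first over (name, arrival, burst) tuples; returns run order."""
        pending = sorted(processes, key=lambda p: p[1])   # by arrival time
        clock, order = 0, []
        while pending:
            ready = [p for p in pending if p[1] <= clock]
            if not ready:                      # nothing has arrived yet: jump to the next arrival
                clock = pending[0][1]
                continue
            job = min(ready, key=lambda p: p[2])   # pick the shortest next CPU burst
            pending.remove(job)
            order.append(job[0])
            clock += job[2]
        return order

    print(sjf([("P1", 0, 6), ("P2", 0, 8), ("P3", 0, 7), ("P4", 0, 3)]))
    # -> ['P4', 'P1', 'P3', 'P2']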

10

Priority Scheduling

A scheduling method where each process is assigned a priority and the CPU is allocated to the highest priority process.
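
A minimal Python sketch of the selection rule, assuming the common textbook convention that a smaller number means a higher priority; the (priority, name) pairs are hypothetical:

    import heapq

    def priority_order(processes):
        """Selection order under non-preemptive priority scheduling.
        `processes` is a list of (priority, name) pairs; smaller number = higher priority."""
        heap = list(processes)
        heapq.heapify(heap)                 # min-heap keyed on the priority number
        order = []
        while heap:
            priority, name = heapq.heappop(heap)   # highest-priority ready process runs next
            order.append(name)
        return order

    print(priority_order([(3, "P1"), (1, "P2"), (4, "P3"), (5, "P4"), (2, "P5")]))
    # -> ['P2', 'P5', 'P1', 'P3', 'P4']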

11

Round Robin (RR)

A preemptive scheduling algorithm that allocates each process in the ready queue a small unit of CPU time (a time quantum) in turn.
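
A minimal round-robin sketch in Python with a hypothetical time quantum of 4; it assumes every process is already in the ready queue at time 0:

    from collections import deque

    def round_robin(processes, quantum=4):
        """Round-robin over (name, burst) pairs; returns completion time per process."""
        queue = deque(processes)               # ready queue, FIFO order
        clock, completion = 0, {}
        while queue:
            name, remaining = queue.popleft()
            run = min(quantum, remaining)      # run for at most one time quantum
            clock += run
            if remaining > run:
                queue.append((name, remaining - run))  # preempt and requeue at the tail
            else:
                completion[name] = clock       # burst finished within this slice
        return completion

    print(round_robin([("P1", 24), ("P2", 3), ("P3", 3)], quantum=4))
    # -> {'P2': 7, 'P3': 10, 'P1': 30}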

12

Multilevel Queue Scheduling

A scheduling algorithm that partitions the ready queue into separate queues, each with its own scheduling algorithm.

13

Multilevel Feedback Queue

A scheduling algorithm that allows processes to move between different queues based on their execution history.
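
A minimal Python sketch of one common feedback policy (new jobs start in the top queue and are demoted when they use their entire quantum); the number of levels, the quanta, and the workload are hypothetical, and new arrivals and preemption are ignored for brevity:

    from collections import deque

    def mlfq(processes, quanta=(4, 8)):
        """Multilevel feedback queue over (name, burst) pairs: two round-robin
        levels with the given quanta plus an FCFS bottom level."""
        levels = [deque(processes), deque(), deque()]    # index 0 = highest-priority queue
        clock, completion = 0, {}
        while any(levels):
            level = next(i for i, q in enumerate(levels) if q)   # serve the highest non-empty queue
            name, remaining = levels[level].popleft()
            run = remaining if level == 2 else min(quanta[level], remaining)
            clock += run
            if remaining > run:
                levels[level + 1].append((name, remaining - run))  # used its whole quantum: demote
            else:
                completion[name] = clock
        return completion

    print(mlfq([("P1", 30), ("P2", 3), ("P3", 6)]))
    # -> {'P2': 7, 'P3': 21, 'P1': 39}: CPU-bound P1 sinks to the bottom queue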

14

Thread Scheduling

The management of how threads are scheduled by the operating system, with distinctions between scheduling user-level and kernel-level threads.

15

Dispatcher

The module that gives control of the CPU to the process selected by the short-term scheduler.

16

CPU-I/O Burst Cycle

The pattern of alternating periods of CPU execution and I/O wait in process execution.

17

Context Switch

The procedure of saving the state (context) of the currently running process and restoring the saved state of the next process so that multiple processes can share a single CPU.

18

Dispatch Latency

The time it takes for the dispatcher to stop one process and start another running.

19

Aging

A technique used in scheduling to prevent starvation by gradually increasing the priority of waiting processes.
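
A minimal Python sketch of one possible aging rule, assuming smaller numbers mean higher priority; the boost rate and the example priorities are hypothetical:

    def age_ready_queue(ready_queue, boost=1):
        """One round of aging over a ready queue of {'name', 'priority'} entries.
        Smaller number = higher priority; every waiting process moves up by `boost`,
        so even a low-priority process eventually reaches the top and runs."""
        for proc in ready_queue:
            proc["priority"] = max(0, proc["priority"] - boost)
        return ready_queue

    queue = [{"name": "P1", "priority": 127}, {"name": "P2", "priority": 5}]
    for _ in range(127):          # e.g. applied once per scheduling interval
        age_ready_queue(queue)
    print(queue)                  # P1 has aged all the way to the top priority (0)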

20

Load Balancing

Distributing the workload evenly across multiple CPUs so that no processor is overloaded while others sit idle.