Lecture_8-CSCIU511-Jahangir_Majumder-Spring_2025

Lecture Information

  • Course: CSCI U511 01 Operating Systems

  • Instructor: AKM Jahangir A Majumder, PhD

  • Date: February 6, 2025

  • Lecture Number: 8

  • Note: Slides adapted from previous instructors; textbook figures are included.

Review and Learning Outcomes

  • Completed Topics: Concurrency and Threads.

  • Current Focus: Synchronization, including:

    • Single-threaded approaches

    • Multithreaded approaches

    • Implementation of threads

    • Multi-threaded OS Kernel

    • Thread context switch

    • Race conditions

    • Locks

  • Reminders:

    • Homework 2 key posted on Blackboard.

    • Quiz 2 covering lectures 3-5 scheduled today.

Thread Synchronization Overview

  • Definition: Synchronization ensures that concurrent accesses to variables yield deterministic outcomes by enforcing the order of thread execution.

  • Allowed: Multiple concurrent reads.

  • Disallowed:

    • Multiple concurrent writes (outcome non-deterministic).

    • One write, multiple reads (outcome non-deterministic).

  • Goal: Ensure deterministic outcomes, especially in threaded programs where threads interact and share variable access.

Example of Potential Panic Condition

  • Threads Illustrated:

    • Thread 1: Computes p, then sets pInitialized = true.

    • Thread 2: Loops while checking pInitialized, then computes q using p.

  • Issue: Compilers and hardware may reorder instructions to improve performance, so the write 'pInitialized = true' can become visible before p is fully written; Thread 2 would then compute q from an uninitialized p.
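The pattern above can be sketched in C11, using release/acquire atomics to forbid the problematic reordering. The names p, q, and pInitialized come from the slide; the value assigned to p is illustrative. Without the explicit memory ordering, the compiler or CPU could legally move the flag write ahead of the write to p.

```c
/* Sketch of the pInitialized pattern, assuming C11 atomics.
   The release store / acquire load pair guarantees that when thread 2
   sees pInitialized == true, the write to p is also visible. */
#include <stdatomic.h>
#include <stdbool.h>
#include <pthread.h>

static double p;                        /* shared result computed by thread 1 */
static double q;                        /* derived value computed by thread 2 */
static atomic_bool pInitialized;        /* zero-initialized: false */

void *thread1(void *arg) {
    (void)arg;
    p = 3.1416;                                    /* compute p first */
    atomic_store_explicit(&pInitialized, true,
                          memory_order_release);   /* then publish the flag */
    return NULL;
}

void *thread2(void *arg) {
    (void)arg;
    while (!atomic_load_explicit(&pInitialized,
                                 memory_order_acquire))
        ;                              /* spin until p is published */
    q = p * 2;                         /* safe: the write to p is visible */
    return NULL;
}
```

A plain (non-atomic) boolean flag would compile, but gives no such guarantee, which is exactly the bug the slide warns about.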

Race Condition Explained

  • Definition: A race condition occurs when the output of a concurrent program depends on the timing or interleaving of operations across threads.

  • Mutual Exclusion: Ensures that only one thread executes a code section at a time.

  • Critical Section: A block of code that must be executed by at most one thread at a time.

    • Lock Mechanism:

      • Acquire lock before entering critical section.

      • Unlock after exiting critical section.

      • Waiting occurs if the lock is already held by another thread.

Critical Section Structure

  • Structure:

    • Entry Section: Acquire lock before critical section. Wait if the lock is held.

    • Critical Section Code: Executed by the thread when it holds the lock.

    • Exit Section: Release the lock after the critical section.
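The entry/critical/exit structure above maps directly onto a POSIX mutex. The shared counter here is an illustrative stand-in for whatever shared data the critical section protects:

```c
/* Minimal sketch of the critical-section structure with a POSIX mutex.
   The counter and its increment are illustrative, not from the slides. */
#include <pthread.h>

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;               /* shared data */

void increment(void) {
    pthread_mutex_lock(&lock);         /* entry section: wait if lock is held */
    counter++;                         /* critical section: exclusive access */
    pthread_mutex_unlock(&lock);       /* exit section: release the lock */
}
```

Without the lock, `counter++` is a read-modify-write that two threads can interleave, losing updates.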

Example of Race Conditions

  • Thread 1 and Thread 2:

    • Thread 1 modifies x to 1.

    • Thread 2 modifies x to 2.

    • Possible Outcomes: The final value of x depends on which write happens last: 1 if Thread 1 writes last, 2 if Thread 2 writes last.

  • Complex Case:

    • Thread 1 sets x to y + 1.

    • Thread 2 sets y to y * 2, where y is initially 12.

    • Possible Final Values of x: 13 or 25, depending on execution order.
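The two serializable outcomes of that example can be checked directly by running the statements in each order (the interleaving at instruction granularity admits only these two results for x):

```c
/* The slide's example with y initially 12:
   T1 executes x = y + 1, T2 executes y = y * 2. */

int order1(void) {          /* Thread 1's statement runs first */
    int y = 12;
    int x = y + 1;          /* x = 13 (reads y before T2 doubles it) */
    y = y * 2;              /* y = 24 */
    return x;
}

int order2(void) {          /* Thread 2's statement runs first */
    int y = 12;
    y = y * 2;              /* y = 24 */
    int x = y + 1;          /* x = 25 (reads the doubled y) */
    return x;
}
```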

Too Much Milk Example

Scenario Overview

  • People Involved: Person A and Person B, each checking the fridge between 12:30 and 1:00 PM.

  • Actions: Each person finds no milk and leaves for the store; because neither knows the other is also buying, both return with milk, leaving too much.

Attempted Solutions

  • Try #1: Leave a note before going to the store, and buy only if there is no milk and no note.

    • Flaw: Both people can check and find no note before either leaves one, so both still buy milk; mutual exclusion is violated.

  • Try #2: Each person leaves their own note first, then checks for the other's note before buying.

    • Drawback: Both may leave notes at the same time and each defer to the other, so neither buys milk and the threads can wait indefinitely.

  • Try #3: Asymmetric roles: one person waits while the other's note is up; the other buys only if no note is present. This works, but it is complex and relies on busy-waiting.
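A sketch of the asymmetric Try #3 in C11, assuming sequentially consistent atomic accesses (the default for `atomic_store`/`atomic_load`); the variable names are illustrative:

```c
/* "Too Much Milk" Try #3: notes as atomic flags, milk as a counter.
   Person A waits while B's note is up; Person B buys only if A has
   no note. Under sequential consistency, at most one person buys. */
#include <stdatomic.h>
#include <stdbool.h>

static atomic_bool noteA, noteB;       /* zero-initialized: no notes */
static atomic_int milk;                /* cartons in the fridge */

void personA(void) {
    atomic_store(&noteA, true);        /* leave note A */
    while (atomic_load(&noteB))        /* wait while B is deciding */
        ;
    if (atomic_load(&milk) == 0)
        atomic_fetch_add(&milk, 1);    /* buy milk */
    atomic_store(&noteA, false);       /* remove note A */
}

void personB(void) {
    atomic_store(&noteB, true);        /* leave note B */
    if (!atomic_load(&noteA))          /* A is not deciding */
        if (atomic_load(&milk) == 0)
            atomic_fetch_add(&milk, 1);
    atomic_store(&noteB, false);       /* remove note B */
}
```

The asymmetry (A spins, B defers) is what makes this correct where Tries #1 and #2 fail, and it is also why locks are preferable: they package this reasoning into two simple operations.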

Building Concurrent Programs

  • Components: Multi-threaded programs depend on:

    • Shared objects developed with state variables and synchronization.

    • Atomic instructions that read or write memory states indivisibly, without interruption.
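The difference between an ordinary memory update and an atomic instruction can be sketched as follows; the counter names are illustrative:

```c
/* A plain increment is a read-modify-write that concurrent threads can
   interleave (losing updates); a C11 atomic fetch-and-add executes as
   one indivisible operation. */
#include <stdatomic.h>

static int plain = 0;                  /* racy if updated concurrently */
static atomic_int atomic_ctr;          /* zero-initialized */

void bump(void) {
    plain++;                           /* load, add, store: interruptible */
    atomic_fetch_add(&atomic_ctr, 1);  /* indivisible read-modify-write */
}
```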

Locks in Mutual Exclusion

  • Lock Operations:

    • Lock::acquire(): Wait until the lock is free, then mark it BUSY before accessing shared data.

    • Lock::release(): Mark the lock FREE and wake up a waiting thread, if any.

  • States of Lock: BUSY or FREE

  • Safety and Progress: Safety means at most one thread holds the lock at a time; progress means that if the lock is FREE and threads are waiting for it, one of them eventually acquires it.
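A minimal way to realize the BUSY/FREE states and the acquire/release operations is a spinlock built on an atomic test-and-set; this is a sketch, not how a production lock would block waiting threads:

```c
/* Spinlock sketch using C11 atomic_flag: clear = FREE, set = BUSY.
   acquire() spins until it atomically flips FREE -> BUSY; release()
   marks the lock FREE again. A real OS lock would block instead of spin. */
#include <stdatomic.h>

typedef struct { atomic_flag busy; } spinlock_t;

void spin_acquire(spinlock_t *l) {
    /* test-and-set returns the old value: loop while it was already BUSY */
    while (atomic_flag_test_and_set_explicit(&l->busy, memory_order_acquire))
        ;
}

void spin_release(spinlock_t *l) {
    atomic_flag_clear_explicit(&l->busy, memory_order_release);  /* FREE */
}

/* Illustrative use: guard a shared counter. */
static spinlock_t count_lock = { ATOMIC_FLAG_INIT };
static int count;

void guarded_inc(void) {
    spin_acquire(&count_lock);
    count++;
    spin_release(&count_lock);
}
```

The acquire/release memory orders also give the lock its second job: making writes done inside the critical section visible to the next holder.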

Example of Lock Use in Memory Management

  • Memory Functions:

    • malloc(): Acquires the heap lock before allocating memory and releases it afterward.

    • free(): Acquires the same lock during deallocation, ensuring exclusive access to the heap metadata.
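This pattern can be sketched as a wrapper around the allocator; note this is illustrative, not actual libc code (real allocators use finer-grained, often per-thread, locking):

```c
/* Illustrative sketch of guarding heap operations with one lock, as the
   slide describes. locked_malloc/locked_free are hypothetical names. */
#include <pthread.h>
#include <stdlib.h>

static pthread_mutex_t heap_lock = PTHREAD_MUTEX_INITIALIZER;

void *locked_malloc(size_t n) {
    pthread_mutex_lock(&heap_lock);    /* exclusive access to heap metadata */
    void *p = malloc(n);               /* stand-in for the real allocation */
    pthread_mutex_unlock(&heap_lock);
    return p;
}

void locked_free(void *p) {
    pthread_mutex_lock(&heap_lock);
    free(p);                           /* deallocation under the same lock */
    pthread_mutex_unlock(&heap_lock);
}
```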

Rules for Using Locks

  • Lock State: Always start in FREE state.

  • Workflow:

    • Acquire before accessing shared data structure at the beginning of the procedure.

    • Release after operations are complete at the end of the procedure.

  • Access Awareness: Only the lock holder should release it; do not pass locks to others.

  • Danger of Concurrent Access: Avoid accessing shared data without a lock.
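The rules above can be seen together in one procedure; the list and its operation are illustrative:

```c
/* The lock starts FREE (static initializer), is acquired at the start of
   every procedure that touches the shared list, and is released at the
   end by the same thread that acquired it. */
#include <pthread.h>

static pthread_mutex_t list_lock = PTHREAD_MUTEX_INITIALIZER; /* starts FREE */
static int items[100];                 /* shared data structure */
static int count = 0;

void list_add(int v) {
    pthread_mutex_lock(&list_lock);    /* acquire at the start of the procedure */
    if (count < 100)
        items[count++] = v;            /* all shared access happens while held */
    pthread_mutex_unlock(&list_lock);  /* holder releases at the end */
}
```

Reading `count` or `items` anywhere without holding `list_lock` would reintroduce exactly the races this section warns against.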

Upcoming Topics

  • Further discussion on Synchronization.

  • Reading Materials: Refer to Textbook Chapters 5.1 - 5.3 for detailed study.