Lecture 29 - Memory 4: Clock, Memory Allocation

LRU Implementation in Software

Concept of LRU

  • LRU stands for the Least Recently Used page replacement algorithm.

  • Idea: Keep track of when each page was last referenced so that the page unused for the longest time can be chosen for replacement.

Suggested Implementation

  • Data Structure: Doubly linked list.

    • On every reference, the accessed page's node is moved to the front of the list.

    • On replacement, the victim is taken from the back (least recently used end) of the list.

Complexity Analysis

  • Page Replacement Complexity: O(1) for removing from the back of the list.

  • However, every page reference must move a node to the front of the list, costing roughly six pointer updates per access, which makes the scheme expensive in practice (see the sketch below).
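
A minimal sketch of the list-based bookkeeping described above (illustrative C; the `node`/`lru_list` types and function names are assumptions, not the lecture's code). It shows why eviction from the back is O(1) while every reference pays a handful of pointer updates to move a node to the front.

```c
#include <stdio.h>
#include <stdlib.h>

typedef struct node {
    int page;
    struct node *prev, *next;
} node;

typedef struct {
    node *head;   /* most recently used  */
    node *tail;   /* least recently used */
} lru_list;

/* On a reference, unlink the node and splice it in at the head:
 * roughly half a dozen pointer updates on every single access. */
void move_to_front(lru_list *l, node *n) {
    if (l->head == n) return;
    if (n->prev) n->prev->next = n->next;
    if (n->next) n->next->prev = n->prev;
    if (l->tail == n) l->tail = n->prev;
    n->prev = NULL;
    n->next = l->head;
    if (l->head) l->head->prev = n;
    l->head = n;
    if (!l->tail) l->tail = n;
}

/* On replacement, the victim is simply the tail: O(1). */
node *evict_lru(lru_list *l) {
    node *victim = l->tail;
    if (!victim) return NULL;
    l->tail = victim->prev;
    if (l->tail) l->tail->next = NULL; else l->head = NULL;
    victim->prev = victim->next = NULL;
    return victim;
}

int main(void) {
    /* Reference pages 1, 2, 3, then 1 again; page 2 becomes the LRU victim. */
    lru_list l = { NULL, NULL };
    node *n[3];
    for (int p = 1; p <= 3; p++) {
        n[p - 1] = calloc(1, sizeof(node));
        n[p - 1]->page = p;
        move_to_front(&l, n[p - 1]);
    }
    move_to_front(&l, n[0]);          /* reference page 1 again */
    node *victim = evict_lru(&l);     /* page 2 is now least recent */
    printf("evicted page %d\n", victim->page);
    for (int i = 0; i < 3; i++) free(n[i]);
    return 0;
}
```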

Conclusion

  • Practically, a true software LRU is too costly, because the list must be updated on every single memory reference.

Approximate LRU

Motivation

  • Direct LRU implementation is not feasible in practice.

  • Approximating LRU leads to alternatives such as the Second Chance algorithm and its circular implementation, the Clock algorithm.

  • The goal is a policy that behaves close to LRU while needing only cheap per-reference bookkeeping.

Variations of LRU

  • Least Frequently Used (LFU): Replaces the page with the lowest reference count; related schemes manage pages with two queues.

  • Adaptive Replacement Cache (ARC): Adapts between recency-based and frequency-based replacement for further improvement.

Clock Algorithm

Overview

  • Structure: Circular arrangement of elements representing pages.

    • Each element contains:

      • Page number

      • Reference bit indicating recent usage.

    • A single clock hand sweeps around the circle, pointing at the next element to examine (one hand for the whole structure).

Insertion Process

  1. Check the reference bit at the clock hand's position.

    • If 0 (not referenced): Replace the page and advance the hand.

    • If 1 (referenced): Set the reference bit to 0, advance the hand, and check the next element; repeat until a page with reference bit 0 is found (the "second chance").

Page Access Process

  • If the page is already in memory, simply set the reference bit to 1 without advancing the hand.
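
Below is a minimal sketch of this behavior (illustrative C under the assumptions above: a fixed table of 4 frames; the `frame` type, the `reference()` function, and the choice to set the reference bit when a page is loaded are assumptions, not the lecture's code).

```c
#include <stdbool.h>
#include <stdio.h>

#define NFRAMES 4

typedef struct {
    int  page;     /* -1 marks an empty frame            */
    bool refbit;   /* set on access, cleared by the hand */
} frame;

frame frames[NFRAMES];
int hand = 0;      /* one hand for the whole circular table */

void reference(int page) {
    for (int i = 0; i < NFRAMES; i++) {
        if (frames[i].page == page) {     /* hit: set the bit, hand stays put */
            frames[i].refbit = true;
            return;
        }
    }
    /* miss: sweep, clearing 1-bits (second chance) until a 0-bit frame is found */
    while (frames[hand].page != -1 && frames[hand].refbit) {
        frames[hand].refbit = false;
        hand = (hand + 1) % NFRAMES;
    }
    printf("fault: page %d goes into frame %d\n", page, hand);
    frames[hand].page   = page;
    frames[hand].refbit = true;           /* assumption: loaded pages start referenced */
    hand = (hand + 1) % NFRAMES;
}

int main(void) {
    for (int i = 0; i < NFRAMES; i++) frames[i].page = -1;
    int refs[] = { 1, 2, 3, 4, 1, 5 };    /* illustrative reference string */
    for (int i = 0; i < 6; i++) reference(refs[i]);
    return 0;
}
```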

Example Walkthrough

  • Memory Capacity: 4 frames, processing a continuous sequence of page requests.

  • Starting Point: All frames are initially empty, so the first references cause page faults.

  • Replacement: On a fault, the requested page replaces the page the hand points to, provided that page's reference bit is 0.

Clock Mechanism Example

  1. The initial page references fill the empty frames.

  2. When an already-resident page is accessed (e.g., Page 1), its reference bit is set to 1 (no change if it is already 1) and the hand does not move.

  3. When a new page must be brought in (e.g., Page 5), the hand sweeps the circle: each reference bit found set to 1 is reset to 0 and the hand advances; the first page found with reference bit 0 is the one replaced.
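
  • Illustrative trace (the frame contents and bits are assumed here, not taken verbatim from the lecture): suppose the 4 frames hold pages 1, 2, 3, 4, where pages 1 and 2 have reference bit 1, pages 3 and 4 have bit 0, and the hand is at page 1. A fault on page 5 clears the bits of pages 1 and 2 and advances past them, reaches page 3 with bit 0, and replaces page 3 with page 5; pages 1 and 2 survive because of their second chance.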

Memory Allocation Considerations

Swapping and Page Size

  • Swapping pages to disk slows processes down; larger page sizes (e.g., 2 MB or 1 GB) reduce this overhead but lead to more internal fragmentation.

Static vs. Dynamic Allocation

  • Static Allocation: Global variables are allocated when the program is loaded and remain for the lifetime of the process, so they never need to be deallocated.

  • Dynamic Allocation: Needed when memory requirements are not known until run time; memory comes from the stack (automatic lifetime) or the heap (explicitly allocated and freed).

    • Dynamic allocation may lead to fragmentation.
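
A short sketch of the three cases in C (hypothetical example, not from the lecture): a global lives for the whole process, a local array lives on the stack until the function returns, and heap memory must be released explicitly.

```c
#include <stdio.h>
#include <stdlib.h>

int counter = 0;                /* static allocation: set up at program load,
                                   exists for the whole process lifetime      */

void demo(size_t n) {
    int on_stack[16];           /* dynamic, stack: freed automatically when
                                   the function returns                       */
    int *on_heap = malloc(n * sizeof *on_heap);   /* dynamic, heap */
    if (!on_heap) return;
    on_stack[0] = ++counter;
    on_heap[0]  = on_stack[0];
    printf("counter=%d heap[0]=%d\n", counter, on_heap[0]);
    free(on_heap);              /* heap memory must be released explicitly */
}

int main(void) {
    demo(4);
    demo(8);                    /* differing sizes and lifetimes are what make
                                   heap fragmentation possible                */
    return 0;
}
```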

Fragmentation Types

  • Internal Fragmentation: Wasted space inside an allocated block, when the block is larger than the request.

  • External Fragmentation: Free memory is broken into small holes between allocations, so larger requests cannot be satisfied even though enough total space is free.

  • Key point: Fragmentation arises because allocations have different lifetimes and different sizes.
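
  • Illustrative numbers (chosen here, not from the lecture): rounding a 100-byte request up to a 128-byte block wastes 28 bytes inside that block (internal fragmentation), while two separate free holes of 64 bytes each cannot satisfy a single 100-byte request even though 128 bytes are free in total (external fragmentation).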

Memory Allocation Strategies

Fit Strategies

  • Best Fit: Allocates the smallest free block that satisfies the request; tends to leave many tiny, hard-to-use leftover holes.

  • Worst Fit: Allocates the largest free block, hoping the leftover piece stays large enough to be useful; in practice it also wastes space.

  • First Fit: Allocates the first free block large enough for the request; fast because the search stops at the first match (see the sketch below).
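
A compact sketch of the three policies over an array of free-hole sizes (illustrative C; the hole sizes and function names are made up for the example).

```c
#include <stdio.h>
#include <stddef.h>

/* Each function returns the index of the chosen free hole, or -1 if none fits. */

int first_fit(const size_t *hole, int n, size_t req) {
    for (int i = 0; i < n; i++)
        if (hole[i] >= req) return i;          /* stop at the first hole that fits */
    return -1;
}

int best_fit(const size_t *hole, int n, size_t req) {
    int best = -1;
    for (int i = 0; i < n; i++)
        if (hole[i] >= req && (best == -1 || hole[i] < hole[best]))
            best = i;                          /* smallest hole that still fits */
    return best;
}

int worst_fit(const size_t *hole, int n, size_t req) {
    int worst = -1;
    for (int i = 0; i < n; i++)
        if (hole[i] >= req && (worst == -1 || hole[i] > hole[worst]))
            worst = i;                         /* largest hole that fits */
    return worst;
}

int main(void) {
    size_t holes[] = { 300, 600, 350, 1200, 400 };   /* free hole sizes in bytes */
    size_t req = 350;
    printf("first fit -> hole %d\n", first_fit(holes, 5, req));   /* hole 1 (600) */
    printf("best  fit -> hole %d\n", best_fit(holes, 5, req));    /* hole 2 (350) */
    printf("worst fit -> hole %d\n", worst_fit(holes, 5, req));   /* hole 3 (1200) */
    return 0;
}
```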

Buddy Allocator

  • Allocations are made in power-of-2 sizes to minimize fragmentation.

  • Management is simplified by splitting blocks into halves ("buddies") and merging them back when freed.

  • Internal fragmentation still arises as requests are rounded up to the nearest power of 2.
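
A minimal sketch of the two buddy-allocator ideas above (illustrative C, not a complete allocator): rounding requests up to a power of two, and finding a block's buddy by flipping a single bit of its offset.

```c
#include <stdio.h>
#include <stddef.h>

/* Round a request size up to the next power of two; the gap between the
 * request and this value is the internal fragmentation. */
size_t round_up_pow2(size_t n) {
    size_t p = 1;
    while (p < n) p <<= 1;
    return p;
}

/* Buddies differ only in the bit corresponding to the block size, so the
 * partner of a block at `offset` with size `size` is found with one XOR. */
size_t buddy_of(size_t offset, size_t size) {
    return offset ^ size;
}

int main(void) {
    size_t req = 3000;                       /* bytes requested (example value) */
    size_t blk = round_up_pow2(req);         /* 4096: rounded-up block size */
    printf("request %zu -> block %zu, internal waste %zu\n",
           req, blk, blk - req);
    printf("buddy of offset 8192 (size 4096) is %zu\n",
           buddy_of(8192, 4096));            /* 12288 */
    return 0;
}
```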