Visible and Hidden Surfaces

Skin Surfaces and Physical Surfaces

  • Some people say "skin surfaces" while others say "physical surfaces."

    • Definition: The two terms are interchangeable; both name the same concept.

XY Viewpoint of a Scene

  • Introduction to the XY viewpoint in a graphical scene.

    • XY Plane: The plane of the screen, with x running horizontally and y running vertically.

    • Z Axis: Indicates depth, going in and out of the screen.

Illustration of Basic Shapes

  • A simple drawing illustrates the following shapes:

    • A triangle in front of a quadrilateral.

  • The position of the eyeball and a virtual screen (view plane).

  • Pixels projected from the triangle and the quadrilateral onto this screen.
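The projection of scene points onto the view plane can be sketched with a simple perspective divide. The eye-at-origin layout and the plane distance `d` below are illustrative assumptions, not details taken from the drawing:

```python
def project(x, y, z, d=1.0):
    """Project a 3D point onto a view plane at distance d from the eye.

    Assumes the eye sits at the origin looking along +z; both the layout
    and the parameter d are illustrative assumptions.
    """
    return (d * x / z, d * y / z)

# A point twice as far away lands twice as close to the screen center.
near = project(2.0, 4.0, 2.0)   # (1.0, 2.0)
far = project(2.0, 4.0, 4.0)    # (0.5, 1.0)
```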

Frame Buffer Concept

  • Introduction to the frame buffer in rendering.

    • Frame Buffer: An area in memory where pixel data is stored.

  • Example shapes projected onto the buffer:

    • Hollow Cylinder: Used as a point of reference.

    • These examples build up to the central problem of rendering overlapping polygons in 3D.
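As a concrete sketch, a frame buffer can be modeled as a grid of pixel values in memory. The dimensions and clear color below are illustrative assumptions:

```python
# Minimal frame buffer sketch: a 2D grid of RGB values in memory.
# The dimensions and clear color are illustrative assumptions.
WIDTH, HEIGHT = 8, 6
CLEAR_COLOR = (0, 0, 0)  # black background

# One (r, g, b) tuple per pixel, stored row by row.
frame_buffer = [[CLEAR_COLOR] * WIDTH for _ in range(HEIGHT)]

def set_pixel(x, y, color):
    """Write a color into the frame buffer at pixel (x, y)."""
    frame_buffer[y][x] = color

set_pixel(3, 2, (255, 0, 0))  # one red pixel
```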

Hidden Surface Problem
  • Introduction to the hidden surface problem through frame buffer examples.

    • Some polygons may be at different depths creating overlap issues.

    • Visualization of depth relationships between polygons:

      • Polygon A is closer than Polygon B in one region while B is closer in another, so no single back-to-front ordering exists (a depth cycle).

  • Explanation of why the Z-buffer algorithm is preferred over the painter's algorithm.

Z Buffer Algorithm

  • The Z-buffer algorithm is introduced as a common method for resolving hidden surfaces.

    • Z Value: Represents distance from the viewer's point of view.

Algorithm Steps

  1. Initialization of Depth Buffer: Every pixel's depth is first set to a baseline "farthest" value (z-far).

  2. Iterate Through Polygons:

    • For each polygon in the scene.

    • For each pixel (x, y) covered by the polygon:

      • Compare the pixel's depth (z value) against the value already stored in the buffer.

      • If the new pixel is closer, write its depth into the buffer and assign the pixel the polygon's color.

    • Example details:

      • Red polygon depth at 3.

      • Green polygon depths ranging from 2 to 4.

      • Initialization values are set to negative infinity (so with this convention, larger z means closer, and any polygon depth overwrites the initial value).
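The steps above can be sketched as follows. The pre-rasterized polygon representation and the larger-z-means-closer convention (matching the negative-infinity initialization in the example) are assumptions for this sketch:

```python
def render(polygons, width, height):
    """Z-buffer sketch. Depths start at negative infinity, so larger z is
    assumed to mean closer to the viewer (matching the initialization
    in the example above)."""
    depth = [[float("-inf")] * width for _ in range(height)]
    color = [[None] * width for _ in range(height)]
    for poly in polygons:
        # Each polygon is assumed pre-rasterized into (x, y, z) samples.
        for x, y, z in poly["pixels"]:
            if z > depth[y][x]:       # closer than what's stored so far
                depth[y][x] = z
                color[y][x] = poly["color"]
    return color

# Red polygon at constant depth 3; green polygon at depths from 2 to 4.
red = {"color": "red", "pixels": [(0, 0, 3.0), (1, 0, 3.0), (2, 0, 3.0)]}
green = {"color": "green", "pixels": [(0, 0, 2.0), (1, 0, 3.5), (2, 0, 4.0)]}
result = render([red, green], 3, 1)   # ['red', 'green', 'green']
```

Note that the output is the same whichever polygon is drawn first, which is the key advantage over ordering-based approaches.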

Historical Context

  • Discussion of historical significance regarding frame buffers.

    • The first frame buffer with per-pixel depth was created in the mid-1970s.

    • Resource constraints: Computer memory was expensive at the time, leading to innovative solutions.

Energy and Light

  • Introduction to spectral energy distributions of sources of light.

    • Different spectral energy distributions depending on the light source (e.g., lamp vs. sun).

    • Terms associated with color: Violet, blue, cyan, green, yellow, orange, red.

    • Mention of spectral energy distribution and how it relates to color perception.

Photoreceptors in the Human Eye

  • Exploration of how light interacts with the human eye and photoreceptors.

    • Photoreceptors: Cells that are sensitive to light.

    • Two main types:

      • Rods: Have a single type of photopigment, rhodopsin, responsible for vision in low light.

      • Cones: Three types, each tuned to a different range of light wavelengths.

        • Sensitivity:

          • Short wavelengths: Blues and violets.

          • Medium wavelengths: Greens.

          • Long wavelengths: Reds and yellows.

Cone Response to Light
  • Explanation of response mechanics of cones to wavelengths.

    • Each cone type fires with a strength that depends on the wavelength of the incoming photons.

    • Consequently, the eye sends color information to the brain as just three numerical values.

    • Observation: Collapsing a full spectrum into three numbers discards spectral detail, so physically different spectra can produce the same perceived color.
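The reduction of a full spectrum to three numbers can be illustrated with a toy model. The triangular sensitivity curves below are crude stand-ins, not measured cone fundamentals:

```python
WAVELENGTHS = range(400, 701, 10)  # visible range, in nm

def triangle(peak, width):
    """A crude triangular sensitivity curve peaking at `peak` nm
    (an illustrative stand-in for a real cone response curve)."""
    return {w: max(0.0, 1.0 - abs(w - peak) / width) for w in WAVELENGTHS}

S_CONE = triangle(440, 60)   # short wavelengths: blues and violets
M_CONE = triangle(540, 80)   # medium wavelengths: greens
L_CONE = triangle(570, 90)   # long wavelengths: reds and yellows

def cone_response(spectrum, sensitivity):
    """Sum spectral power weighted by the cone's sensitivity at each band."""
    return sum(spectrum.get(w, 0.0) * s for w, s in sensitivity.items())

# Any spectrum, however detailed, collapses to just three numbers (L, M, S).
flat = {w: 1.0 for w in WAVELENGTHS}
lms = tuple(cone_response(flat, c) for c in (L_CONE, M_CONE, S_CONE))
```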

Unique Human Color Perception

  • Discussion on color perception uniqueness in humans.

    • The phenomenon of tetrachromacy in rare individuals, leading to enhanced color perception.

  • Brief mention of other species and the evolution of vision.

Conclusion

  • Closing notes highlighting the diversity of eye structure evolution in nature.

  • Speculation on potential advances in photography and image rendering.

Color Space Representation
  • The first image represents a color space, specifically a chromaticity diagram.

    • Chromaticity Diagram: It visually represents the range of colors that can be perceived, mapping colors based on their chromatic components.

    • Vertices: The corners of the inscribed gamut triangle correspond to the display primaries (red, green, and blue), with the colors they can reproduce lying in between.

    • White Point: Typically located near the center; this is the reference point for displaying colors that appear white under specific lighting conditions.

    • Axes: The axes are labeled as u' and v' representing specific color components in the CIE (Commission Internationale de l'Éclairage) color space.
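The u' and v' axes come from the CIE 1976 uniform chromaticity scale, and the conversion from XYZ tristimulus values is standard. Using the D65 white point in the example is an assumption about which white the diagram marks:

```python
def xyz_to_uv_prime(X, Y, Z):
    """Convert CIE XYZ tristimulus values to CIE 1976 (u', v') coordinates."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d

# D65 white point (Y normalized to 1) lands near (0.1978, 0.4683),
# close to the center of the diagram.
u, v = xyz_to_uv_prime(0.95047, 1.0, 1.08883)
```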

Solar Radiation Spectrum
  • The second image discusses the solar radiation spectrum, indicating the following:

    • Irradiance: The graph shows spectral irradiance as a function of wavelength, outlining how sunlight interacts with the Earth's atmosphere.

    • Blackbody Spectrum: Highlighted is a blackbody spectrum at 5250°C, showing the amount of radiation emitted by a perfect blackbody at that temperature.

    • Absorption Bands: Specific wavelengths where atmospheric gases absorb sunlight, particularly water vapor, O₂, and CO₂.

    • Importance: Understanding this spectrum is crucial for scientists studying climate, ecology, and solar energy.
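The blackbody curve in the figure follows Planck's law. Sketching it numerically (constants in SI units) puts the peak of a 5250 °C blackbody in the green part of the visible range, consistent with the solar curve:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength_m, temperature_k):
    """Spectral radiance of a blackbody (Planck's law), in W / (sr * m^3)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * K * temperature_k)) - 1.0
    return a / b

T_SUN = 5250.0 + 273.15  # the figure's 5250 degC, in kelvin
# Locate the peak by brute force over 200-2000 nm.
peak_nm = max(range(200, 2001), key=lambda nm: planck(nm * 1e-9, T_SUN))
```

Wien's displacement law gives the same peak analytically: lambda_max = 2.898e-3 m*K / 5523 K, roughly 525 nm.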

Pixel Depth Calculations in Rendering
  • Additional images showcase algorithms related to the Z-buffer method for rendering scenes in computer graphics.

    • Z-buffer: It represents per-pixel depth information, ensuring correct rendering of overlapping objects in a 3D scene.

    • Algorithm Steps:

      1. For each pixel (x, y), initialize the color and Z (depth) values.

      2. Iterate over each polygon, determine which pixels it covers, and write Z values that pass the depth comparison.

    • Conditional Statements: A per-pixel depth comparison determines when to write a new pixel color, ensuring that closer objects obscure those farther away.
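The notes use two initialization conventions (z-far in the step list, negative infinity in the earlier example); the depth test itself is a single comparison whose direction flips with the convention, as this small sketch shows:

```python
def passes_depth_test(z_new, z_stored, smaller_is_closer=True):
    """Per-pixel depth test. With smaller-z-means-closer, the buffer starts
    at z-far (or +inf) and we write on `<`; with larger-z-means-closer, it
    starts at -inf and we write on `>`. Both conventions appear in practice."""
    if smaller_is_closer:
        return z_new < z_stored
    return z_new > z_stored
```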

Conclusion
  • The interplay between color perception and light-source spectra underpins advances in digital imaging and graphics rendering; understanding these concepts can lead to better visual output in technology and art.