Graphics Pipeline Diagram
A flowchart showing how 3D data becomes a 2D image (modeling → viewing → projection → clipping → rasterizing → shading → output).
Modeling Stage
Places objects into the world using transforms like scale, rotate, translate.
Viewing Stage
Moves and orients the camera so the world is seen from the correct viewpoint.
Projection Stage
Converts 3D view into a 2D image using orthographic or perspective projection.
Clipping Stage
Removes geometry outside the camera frustum so only visible objects are processed.
Rasterization Stage
Converts triangles into fragments/pixels.
Shading Stage
Computes lighting and color for each visible fragment.
Output Stage
Displays the final processed pixels onto the screen.
Object Coordinates
Local coordinate system of an individual model before being placed in the world.
World Coordinates
Shared scene coordinate system where all objects exist together.
Camera/Eye Coordinates
Coordinates expressed relative to the camera’s position and orientation.
Clip Coordinates
4D coordinates produced by the projection matrix, before perspective division; clipping is performed against them.
NDC Coordinates
Standardized cube (−1 to 1) representing visible space after perspective division.
Screen Coordinates
Pixel coordinates after mapping NDC to the viewport.
Transformations Between Coordinate Systems
Matrices that convert points from object → world → eye → clip → NDC → screen.
Rasterization
The process of turning triangles into pixel-sized fragments.
Lighting/Shading
Calculating brightness and color per fragment based on lights and materials.
Visible Surface Determination
Algorithms deciding which surfaces are in front.
Rasterization Diagram
A triangle filled pixel-by-pixel.
Lighting Diagram
Shows light direction, normal, and reflection.
Visibility Diagram
Shows occlusion using depth.
Fixed Pipeline Stages
Non-programmable GPU stages such as clipping, rasterizing, depth testing.
Translation Matrix
A matrix that moves an object in x, y, and z.
Rotation Matrix
A matrix that rotates an object around an axis.
Scaling Matrix
A matrix that enlarges or shrinks an object.
Combining Transformations
Applying transforms one after another; the order of application changes the result.
Transformation Sequence
Ordered list of transforms like scale → rotate → translate.
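A minimal numpy sketch of the transform cards above: 4×4 translation, scaling, and rotation matrices composed in two different orders, showing that the order changes the result. The function names are illustrative, not from any particular library.

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    M = np.eye(4)
    M[:3, 3] = [tx, ty, tz]
    return M

def scaling(sx, sy, sz):
    """4x4 scaling matrix."""
    return np.diag([sx, sy, sz, 1.0])

def rotation_z(angle_rad):
    """4x4 rotation about the z axis."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    M = np.eye(4)
    M[0, 0], M[0, 1] = c, -s
    M[1, 0], M[1, 1] = s, c
    return M

p = np.array([1.0, 0.0, 0.0, 1.0])   # a point in homogeneous coordinates

# Scale -> rotate -> translate (matrices apply right-to-left onto the point)
M1 = translation(5, 0, 0) @ rotation_z(np.pi / 2) @ scaling(2, 2, 2)
# Translate first, then rotate and scale: a different matrix, a different result
M2 = scaling(2, 2, 2) @ rotation_z(np.pi / 2) @ translation(5, 0, 0)

print(M1 @ p)   # -> (5, 2, 0, 1)
print(M2 @ p)   # -> (0, 12, 0, 1)
```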
Camera Position
The eye location in world coordinates.
Look-at Point
The point the camera is aimed at.
Up Vector
The direction that defines “up” for the camera.
Camera Basis (u,v,w)
Camera’s right, up, and backward axes used to build the view matrix.
gluLookAt
Function that constructs the viewing transformation matrix.
Viewing Formula V = R T
Viewing transformation equals rotation times translation.
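A sketch of how a gluLookAt-style view matrix can be built from the camera basis: w points backward (eye − look-at, normalized), u = up × w, v = w × u, and V = R · T. This assumes the common right-handed OpenGL convention; the function name is illustrative.

```python
import numpy as np

def look_at(eye, target, up):
    """Build a view matrix V = R @ T from eye position, look-at point, and up vector."""
    eye, target, up = map(np.asarray, (eye, target, up))
    w = eye - target
    w = w / np.linalg.norm(w)        # backward axis
    u = np.cross(up, w)
    u = u / np.linalg.norm(u)        # right axis
    v = np.cross(w, u)               # true up axis

    R = np.eye(4)                    # rotation: align world axes with camera axes
    R[0, :3], R[1, :3], R[2, :3] = u, v, w

    T = np.eye(4)                    # translation: move the eye to the origin
    T[:3, 3] = -eye
    return R @ T

V = look_at(eye=[0, 0, 5], target=[0, 0, 0], up=[0, 1, 0])
print(V @ np.array([0, 0, 0, 1]))    # world origin ends up at (0, 0, -5): 5 units in front of the camera
```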
Ortho Matrix Derivation
Built using translate, scale, and reflect operations to normalize the view volume.
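A small sketch of an OpenGL-style orthographic matrix: translate the box center to the origin, scale each axis to width 2, and flip z so the view volume maps to the canonical cube [-1, 1]^3. Parameter names l, r, b, t, n, f are the usual left/right/bottom/top/near/far bounds; the convention assumed here is a camera looking down -z.

```python
import numpy as np

def ortho(l, r, b, t, n, f):
    """Map the box [l,r] x [b,t] x [-f,-n] to the canonical cube [-1,1]^3
    (equivalent to translate-to-center, scale-to-size-2, and a z reflection)."""
    return np.array([
        [2 / (r - l), 0,           0,            -(r + l) / (r - l)],
        [0,           2 / (t - b), 0,            -(t + b) / (t - b)],
        [0,           0,           -2 / (f - n), -(f + n) / (f - n)],
        [0,           0,           0,            1],
    ])

P = ortho(-2, 2, -1, 1, 1, 10)
print(P @ np.array([2, 1, -10, 1]))   # far top-right corner maps to (1, 1, 1, 1)
```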
Similar Triangles Projection
Uses similar-triangle proportions to explain why distant objects project smaller on the image plane.
Homogeneous Coordinates
4D representation needed for translation and perspective in matrix form.
W Component
The fourth homogeneous coordinate; after a perspective projection it holds the depth-dependent divisor that produces foreshortening.
Perspective Division
Dividing x,y,z by w to apply perspective effects.
Perspective Projection Matrix
Built from scaling, shearing, and translating operations that map the view frustum to the canonical view volume.
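An OpenGL-style symmetric perspective matrix and the perspective division that follows. This assumes a right-handed eye space looking down -z with near/far planes n and f; it is one common formulation, not necessarily the exact derivation used in the course notes.

```python
import numpy as np

def perspective(fovy_deg, aspect, n, f):
    """Symmetric perspective frustum -> clip space (camera looks down -z)."""
    t = 1.0 / np.tan(np.radians(fovy_deg) / 2)
    return np.array([
        [t / aspect, 0, 0,                 0],
        [0,          t, 0,                 0],
        [0,          0, (f + n) / (n - f), 2 * f * n / (n - f)],
        [0,          0, -1,                0],   # copies -z_eye into w, which drives foreshortening
    ])

P = perspective(90, 1.0, 1, 100)
clip = P @ np.array([0, 1, -10, 1])   # a point 10 units in front of the camera
ndc = clip[:3] / clip[3]              # perspective division by w
print(ndc)                            # y shrinks to 0.1: distant objects appear smaller
```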
Global Illumination
Lighting model including indirect light bounces.
Local Illumination
Lighting based only on direct light.
Ambient Term
Constant background light applied everywhere.
Diffuse Term
Light depending on angle between normal and light direction.
Specular Term
Shiny highlight depending on viewer and reflection vector.
Phong Dot Products
n·l and r·v calculations determining diffuse and specular lighting.
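A worked numeric sketch of the Phong terms I = ka*Ia + kd*Il*(n·l) + ks*Il*(r·v)^s. The coefficients and vectors are made up for illustration; only the structure of the dot products matches the cards above.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def phong(n, l, v, ka=0.1, kd=0.7, ks=0.4, shininess=32, Ia=1.0, Il=1.0):
    """Classic Phong: ambient + diffuse (n.l) + specular (r.v)^shininess."""
    n, l, v = normalize(n), normalize(l), normalize(v)
    diff = max(np.dot(n, l), 0.0)
    r = 2 * np.dot(n, l) * n - l              # reflection of the light direction about the normal
    spec = max(np.dot(r, v), 0.0) ** shininess if diff > 0 else 0.0
    return ka * Ia + kd * Il * diff + ks * Il * spec

# Light and viewer both 45 degrees off the surface normal, on mirrored sides
I = phong(n=np.array([0, 0, 1.0]),
          l=np.array([1.0, 0, 1.0]),
          v=np.array([-1.0, 0, 1.0]))
print(round(I, 3))   # ~0.995: strong specular because v lies along the mirror direction r
```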
Phong Diagram
Diagram showing normal, light, view, and reflection directions.
Calculator Use
Needed for Phong calculations when numbers are provided.
Flat Shading
Lighting once per polygon producing faceted surfaces.
Gouraud Shading
Lighting computed per vertex, then interpolated across the polygon.
Phong Shading
Lighting per fragment giving smooth, accurate results.
BRDF
Bidirectional Reflectance Distribution Function: describes how much light arriving from one direction is reflected toward each outgoing direction.
Texture Mapping Basics
Applying a 2D image onto a 3D object using texture coordinates.
Texture Coordinate Systems
Texture space (s,t), parametric space (u,v), object space, screen space.
Two-Part Mapping
Mapping the texture to an intermediate shape first, then from that shape onto the complex object.
Environment Mapping
Using cube or sphere maps to simulate reflections.
Bump Mapping
Fakes bumps by perturbing surface normals with a texture, without changing the underlying geometry.
Minification Problem
Too many texels per pixel causing aliasing.
Magnification Problem
Too few texels per pixel causing pixelation.
Mipmapping
Multiple resolutions of textures used to reduce aliasing at distance.
Anisotropic Filtering
Improves texture quality at steep viewing angles.
Cohen–Sutherland Clipping
Line clipping using region outcodes.
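A sketch of the Cohen–Sutherland region outcodes against a rectangular window: each endpoint gets a 4-bit code, both codes zero means trivial accept, a non-zero AND means trivial reject, otherwise the segment must be clipped. The bit order and names are illustrative.

```python
LEFT, RIGHT, BOTTOM, TOP = 1, 2, 4, 8

def outcode(x, y, xmin, xmax, ymin, ymax):
    """4-bit Cohen-Sutherland region code for a point against the clip window."""
    code = 0
    if x < xmin: code |= LEFT
    elif x > xmax: code |= RIGHT
    if y < ymin: code |= BOTTOM
    elif y > ymax: code |= TOP
    return code

def classify(p0, p1, window=(0, 10, 0, 10)):
    """Trivial accept / trivial reject test for a segment; otherwise it needs clipping."""
    c0 = outcode(*p0, *window)
    c1 = outcode(*p1, *window)
    if c0 == 0 and c1 == 0:
        return "accept"      # both endpoints inside
    if c0 & c1:
        return "reject"      # both endpoints share an outside region
    return "clip"            # segment must be clipped against the window edges

print(classify((2, 2), (8, 8)))     # accept
print(classify((12, 1), (15, 4)))   # reject (both right of the window)
print(classify((-2, 5), (5, 5)))    # clip
```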
Sutherland–Hodgman Clipping
Polygon clipping by clipping against each plane sequentially.
DDA Algorithm
Line rasterization using incremental steps.
Bresenham’s Algorithm
Efficient line rasterization choosing nearest integer pixel.
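A minimal integer-only Bresenham sketch using the common error-accumulator formulation that handles all octants; variable names are not tied to any specific textbook.

```python
def bresenham(x0, y0, x1, y1):
    """Integer-only line rasterization: at each step pick the pixel closest to the true line."""
    pixels = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy                  # running decision value
    while True:
        pixels.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:               # step in x
            err += dy
            x0 += sx
        if e2 <= dx:               # step in y
            err += dx
            y0 += sy
    return pixels

print(bresenham(0, 0, 6, 4))
# [(0, 0), (1, 1), (2, 1), (3, 2), (4, 3), (5, 3), (6, 4)]
```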
Triangle Rasterization
Determining which pixels lie inside a triangle.
Inside/Outside Tests
Using edge functions or barycentric coordinates to test containment.
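A sketch of triangle rasterization with edge functions: the signed-area test is evaluated at each pixel center inside the triangle's bounding box, and a pixel is inside when all three edge functions agree in sign. Counter-clockwise winding is assumed; coordinates are illustrative.

```python
def edge(a, b, p):
    """Signed area (cross product) of edge a->b with point p; the sign tells which side p is on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize_triangle(v0, v1, v2):
    """Return pixel coordinates whose centers pass all three edge tests (CCW winding assumed)."""
    xmin = int(min(v[0] for v in (v0, v1, v2)))
    xmax = int(max(v[0] for v in (v0, v1, v2)))
    ymin = int(min(v[1] for v in (v0, v1, v2)))
    ymax = int(max(v[1] for v in (v0, v1, v2)))
    inside = []
    for y in range(ymin, ymax + 1):
        for x in range(xmin, xmax + 1):
            p = (x + 0.5, y + 0.5)                  # sample at the pixel center
            w0 = edge(v1, v2, p)
            w1 = edge(v2, v0, p)
            w2 = edge(v0, v1, p)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:     # same side of all three edges
                inside.append((x, y))
    return inside

pixels = rasterize_triangle((0, 0), (6, 0), (0, 6))
print(len(pixels), (2, 2) in pixels, (5, 5) in pixels)   # 21 covered pixels; (5, 5) lies outside the hypotenuse
```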
Polygon Rasterization
Filling polygons by splitting into triangles or scanning.
Tile-Based Rasterization
Dividing the screen into small tiles for efficient rasterization.
Viewport Transform Matrix
Converts NDC coordinates to actual screen pixel positions.
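A small sketch of the viewport mapping from NDC [-1, 1] to pixel coordinates for a window of width and height in pixels; whether y is flipped depends on the window system, so the flip here is just one common choice.

```python
def viewport(ndc_x, ndc_y, width, height):
    """Map NDC in [-1, 1] to screen pixels; y is flipped so (0, 0) is the top-left corner."""
    sx = (ndc_x + 1) * 0.5 * width
    sy = (1 - ndc_y) * 0.5 * height
    return sx, sy

print(viewport(0.0, 0.0, 800, 600))    # center of NDC -> (400.0, 300.0)
print(viewport(-1.0, 1.0, 800, 600))   # top-left corner -> (0.0, 0.0)
```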
Back-Face Culling
Removing triangles facing away from the camera.
Painter’s Algorithm
Draw far objects first, then near ones, so closer surfaces paint over farther ones.
Z-Buffer Algorithm
Keeps the closest fragment by comparing depth values.
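A sketch of the z-buffer test: a depth buffer initialized to "far", and a fragment only overwrites the color buffer when its depth is closer than the stored value (a smaller-depth-is-closer convention is assumed here).

```python
import numpy as np

WIDTH, HEIGHT = 4, 3
depth_buffer = np.full((HEIGHT, WIDTH), np.inf)   # start everything at "infinitely far"
color_buffer = np.zeros((HEIGHT, WIDTH, 3))

def write_fragment(x, y, z, color):
    """Keep the fragment only if it is closer than whatever is already stored at (x, y)."""
    if z < depth_buffer[y, x]:
        depth_buffer[y, x] = z
        color_buffer[y, x] = color

write_fragment(1, 1, z=5.0, color=(1, 0, 0))   # red surface at depth 5
write_fragment(1, 1, z=2.0, color=(0, 1, 0))   # green surface in front: wins
write_fragment(1, 1, z=9.0, color=(0, 0, 1))   # blue surface behind: rejected
print(color_buffer[1, 1])                      # [0. 1. 0.]
```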
Depth Non-Linearity
Z precision is higher near the camera and lower far away.
Anti-Aliasing Basics
Techniques for reducing jagged edges.
Primitive Assembly
Combining vertices into triangles or lines.
View-Volume Clipping
Removing geometry outside the canonical view volume.
Perspective Division Stage
Converts clip coordinates into NDC by dividing by w.
Viewport Transformation Stage
Converts NDC to screen coordinates.
Rasterization Stage
Converts primitives into fragments for shading.
Interpolated Attributes
Values like depth, color, and texcoords interpolated across triangles.
1/w Interpolation
Required for correct perspective texture mapping and depth interpolation.
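A sketch of why attributes must be interpolated as attribute/w and then divided by the interpolated 1/w: naive screen-space interpolation of a texture coordinate across two endpoints at different depths gives a different (incorrect) answer than the perspective-correct version. The numbers are made up.

```python
# Two endpoints of a screen-space span, with clip-space w (roughly eye depth) and a texcoord u.
u0, w0 = 0.0, 1.0      # near endpoint
u1, w1 = 1.0, 5.0      # far endpoint
t = 0.5                # halfway across the span in *screen* space

# Naive screen-space interpolation (wrong under perspective):
u_affine = (1 - t) * u0 + t * u1

# Perspective-correct: interpolate u/w and 1/w linearly, then divide.
u_over_w = (1 - t) * (u0 / w0) + t * (u1 / w1)
one_over_w = (1 - t) * (1 / w0) + t * (1 / w1)
u_correct = u_over_w / one_over_w

print(u_affine)    # 0.5
print(u_correct)   # ~0.167: the far half of the span covers most of the texture
```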
Programmable Pipeline Stages
Vertex and fragment shaders that can run user-defined code.
Fixed Pipeline Stages
Stages that cannot be programmed (clip, rasterize, depth test).
Vertex Shader
Processes each vertex and outputs clip-space position and varyings.
Vertex Shader Inputs
Position, normal, texcoord, and other per-vertex attributes.
Vertex Shader Outputs
Clip-space position and interpolated varyings.
Fragment Shader
Computes final per-pixel color using interpolated data.
Fragment Shader Inputs
Interpolated varyings like color, texcoords, normals.
Fragment Shader Outputs
Final pixel color written to the framebuffer.
Uniform Variables
Constant values used by all shader invocations (lights, matrices, etc.).
Varying/In-Out Variables
Data passed from vertex to fragment shader via interpolation.
Rasterizer Interpolation
Hardware computes how varyings change across a triangle.
Per-Fragment Phong
Phong lighting computed in the fragment shader for accuracy.
Eye-Space Position in Shaders
Vertex shader passes eye-space coordinates for lighting math.
Normals in Shaders
Vertex shader outputs normals for interpolation and fragment lighting.
Ray Tracing Algorithm
Simulates rays to compute lighting, shadows, reflections, and refractions.
Primary Rays
Rays cast from the camera through each pixel.
Secondary Rays
Reflection, shadow, and refraction rays spawned after intersections.
Ray Tracing Diagram
Illustration of rays hitting surfaces and bouncing.
Ray Tracing Pseudocode
Step-by-step outline of casting rays and computing color recursively.
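A compact recursive sketch of that outline: primary rays from the camera, the nearest sphere intersection, a shadow ray toward the light, and a secondary reflection ray. The scene, constants, and shading weights are all made up for illustration.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def hit_sphere(origin, direction, center, radius):
    """Return the nearest positive ray parameter t, or None if the (unit) ray misses the sphere."""
    oc = origin - center
    b = 2 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2
    return t if t > 1e-4 else None

spheres = [((0.0, 0.0, -5.0), 1.0, (1.0, 0.2, 0.2)),   # (center, radius, color)
           ((1.5, 0.0, -4.0), 0.5, (0.2, 0.2, 1.0))]
light_pos = np.array([5.0, 5.0, 0.0])

def trace(origin, direction, depth=0):
    """Recursively compute the color seen along a ray."""
    nearest = None
    for center, radius, color in spheres:
        t = hit_sphere(origin, direction, np.array(center), radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, np.array(center), np.array(color))
    if nearest is None:
        return np.array([0.1, 0.1, 0.1])               # background
    t, center, color = nearest
    point = origin + t * direction
    normal = normalize(point - center)
    to_light = normalize(light_pos - point)

    # Shadow ray: any sphere hit along the light direction blocks the direct term
    # (the distance to the light is ignored in this sketch).
    in_shadow = any(hit_sphere(point, to_light, np.array(c), r) for c, r, _ in spheres)
    diffuse = 0.0 if in_shadow else max(np.dot(normal, to_light), 0.0)
    result = color * (0.1 + 0.8 * diffuse)

    # Secondary reflection ray, up to a small recursion depth.
    if depth < 2:
        reflect_dir = direction - 2 * np.dot(direction, normal) * normal
        result = result + 0.2 * trace(point, normalize(reflect_dir), depth + 1)
    return result

# Primary ray through the middle of the image, straight down -z.
print(trace(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0])))
```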