Stage 1: Application
Primarily involves processes that run on the CPU
Step 1: Creating a window where the rendered graphics will be displayed
The window is initialized to display the contents of the GPU framebuffer, where the rendered graphics are stored. For animated and interactive applications, the main application contains a loop that repeatedly re-renders the scene, usually aiming for a rate of 60 frames per second (FPS).
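The main-loop idea can be sketched in plain Python. This is a minimal, hypothetical fixed-rate loop (the `render_frame` callback and frame count are illustrative stand-ins, not part of any real windowing API): each iteration re-renders the scene, then sleeps out the remainder of the per-frame time budget.

```python
import time

def run_loop(render_frame, total_frames, fps=60):
    """A minimal fixed-rate main loop: render, then wait out the frame budget."""
    frame_budget = 1.0 / fps
    for frame in range(total_frames):
        start = time.perf_counter()
        render_frame(frame)                      # re-render the scene each iteration
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)   # sleep off the rest of the 1/60 s slice

rendered = []
run_loop(lambda frame: rendered.append(frame), total_frames=3)
```

Real frameworks also poll input events inside this loop and exit when the window closes; the sleep-based pacing here is only the simplest possible scheme.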
Step 2: Reading data required for the rendering process
This data may include vertex attributes, which describe the appearance of the geometric shapes being rendered.
Vertex buffer objects (VBOs)
GPU memory buffers in which vertex attribute data is stored.
Texture buffers
Buffers in which images to be used as textures are stored.
Step 3: Sending data to the GPU
The application needs to specify the associations between attribute data stored in VBOs and attribute variables in the vertex shader program.
Vertex array objects (VAOs)
Manage sets of these associations, storing information that can be activated and deactivated as needed during the rendering process.
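What a VAO records can be modeled with a small, hypothetical plain-Python stand-in (no real OpenGL calls; the class, method names, and buffer ids below are all assumptions for illustration): it maps each vertex-shader attribute variable to the VBO that feeds it, so the whole set of associations can be activated in one step before drawing.

```python
# Hypothetical stand-in for a vertex array object: it records which VBO
# feeds each vertex-shader attribute variable.
class VertexArrayObject:
    def __init__(self):
        self.associations = {}   # attribute variable name -> VBO id

    def associate(self, attribute_name, vbo_id):
        self.associations[attribute_name] = vbo_id

    def bind(self):
        # In real OpenGL, binding a VAO activates all stored associations at once.
        return dict(self.associations)

vao = VertexArrayObject()
vao.associate("vertexPosition", vbo_id=1)
vao.associate("vertexColor", vbo_id=2)
active = vao.bind()
```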
Stage 2: Geometry Processing
Determining the final position of each vertex of the geometric shapes to be rendered, implemented by a program known as a vertex shader
Mesh
The shape of a geometric object is defined by a collection of points that are grouped into lines or triangles.
Vertex
Main Definition: The data structure into which the properties or attributes specific to rendering each individual point are grouped.
Other Information: At minimum, it contains the three-dimensional position of the corresponding point.
Texture coordinates (UV coordinates)
Indicate a point in an image that is mapped to the vertex
Normal vector
Indicates the direction perpendicular to a surface and is typically used in lighting calculations
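The vertex attributes listed above can be grouped into one data structure, as the definition describes. A minimal sketch (the field names and defaults are assumptions, not a fixed standard):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Vertex:
    position: Tuple[float, float, float]                   # 3D position (always present)
    uv: Tuple[float, float] = (0.0, 0.0)                   # texture coordinates
    normal: Tuple[float, float, float] = (0.0, 0.0, 1.0)   # surface direction for lighting

v = Vertex(position=(1.0, 2.0, 3.0), uv=(0.5, 0.5))
```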
Step 1: Model transformation
The collection of points defining the intrinsic shape of an object may be translated, rotated, and scaled. Hence, the object appears to have a particular location, orientation, and size with respect to a virtual three-dimensional world.
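The model transformation can be illustrated with plain trigonometry (a simplified sketch using per-axis functions instead of the 4x4 matrices a real pipeline would use): scale a model-space point, rotate it, then translate it into the world.

```python
import math

def translate(p, tx, ty, tz):
    x, y, z = p
    return (x + tx, y + ty, z + tz)

def scale(p, s):
    x, y, z = p
    return (x * s, y * s, z * s)

def rotate_z(p, angle):
    # Rotation about the z-axis by `angle` radians
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

# Model transformation: scale, then rotate, then translate a model-space point
p = (1.0, 0.0, 0.0)
world = translate(rotate_z(scale(p, 2.0), math.pi / 2), 3.0, 0.0, 0.0)
```

In practice these three operations are combined into a single model matrix so every vertex is transformed by one matrix multiplication.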
World Space
The frame of reference whose origin is at the center of the scene; coordinates are expressed relative to this frame.
Step 2: View Transformation
Transforms coordinates so they are expressed relative to the camera; coordinates in this context are said to be in view space
View Space (Camera space or Eye space)
The result when world-space coordinates are transformed to coordinates in front of the user's view.
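The view transformation applies the inverse of the camera's own placement. In the simplified case sketched below (an assumption: the camera is translated but not rotated), subtracting the camera's position re-expresses a world-space point in view space:

```python
# Simplified view transformation: camera at camera_pos with no rotation.
# Subtracting camera_pos expresses the point relative to the camera.
def world_to_view(world_point, camera_pos):
    wx, wy, wz = world_point
    cx, cy, cz = camera_pos
    return (wx - cx, wy - cy, wz - cz)

view = world_to_view((3.0, 2.0, 10.0), camera_pos=(0.0, 0.0, 12.0))
```

A full view matrix also inverts the camera's rotation; this sketch keeps only the translation part.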
Step 3: Projection Transformation
Any points outside the specified region are discarded or clipped from the scene
Clip Space
The coordinate space produced by the projection transformation; points outside it are clipped.
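A minimal perspective projection can be sketched as follows. The parameters are assumptions chosen for simplicity (90° vertical field of view, square aspect ratio, near = 0.1, far = 100): a view-space point is mapped to homogeneous clip-space coordinates, then the perspective divide by w yields normalized device coordinates, and points outside the [-1, 1] cube are clipped.

```python
import math

def project(p, fov=math.pi / 2, near=0.1, far=100.0):
    """Map a view-space point to normalized device coordinates (assumed params)."""
    x, y, z = p
    f = 1.0 / math.tan(fov / 2)             # focal scale from the field of view
    a = (far + near) / (near - far)
    b = 2 * far * near / (near - far)
    # Homogeneous clip-space coordinates; w carries -z for the perspective divide
    xc, yc, zc, wc = f * x, f * y, a * z + b, -z
    return (xc / wc, yc / wc, zc / wc)      # perspective divide -> NDC

ndc = project((1.0, 1.0, -2.0))
```

The divide by w = -z is what makes distant objects appear smaller: the same x offset shrinks as depth grows.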
Stage 3: Rasterization
This stage begins once the vertex shader has specified the final positions of each vertex. The points themselves must first be grouped into the desired type of geometric primitive: points, lines, or triangles, consisting of sets of 1, 2, or 3 points.
Line Strip
Set of connected line segments
Primitive Assembly
The process of grouping points into geometric primitives
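Primitive assembly is just grouping: consecutive vertices form independent primitives of 1, 2, or 3 points, while a line strip reuses each vertex in a sliding window. A small sketch (function names are illustrative):

```python
def assemble(vertices, vertices_per_primitive):
    """Group a flat vertex list into independent primitives of 1, 2, or 3 vertices."""
    n = vertices_per_primitive
    return [tuple(vertices[i:i + n]) for i in range(0, len(vertices) - n + 1, n)]

def assemble_line_strip(vertices):
    """A line strip shares vertices: segments (v0,v1), (v1,v2), ..."""
    return [(vertices[i], vertices[i + 1]) for i in range(len(vertices) - 1)]

tris = assemble(["A", "B", "C", "D", "E", "F"], 3)   # two triangles
strip = assemble_line_strip(["A", "B", "C"])         # two connected segments
```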
Fragment
A collection of data used to determine the color of a single pixel in a rendered image; is created for each pixel corresponding to the interior of a shape.
Raster Position (Pixel Coordinates)
Data stored in a fragment always includes this
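Fragment generation for a triangle can be sketched with the edge-function test (one common rasterization technique, shown here as an assumption-laden simplification: integer pixel grid, counter-clockwise winding, no depth or attribute interpolation). A fragment is emitted for every pixel whose center lies inside the triangle:

```python
def edge(a, b, p):
    """Signed area test: positive when p is to the left of edge a->b."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(v0, v1, v2):
    """Emit a fragment (pixel coordinates) for each pixel center inside the triangle."""
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    fragments = []
    for y in range(int(min(ys)), int(max(ys)) + 1):        # scan the bounding box
        for x in range(int(min(xs)), int(max(xs)) + 1):
            p = (x + 0.5, y + 0.5)                         # sample at the pixel center
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:            # inside (counter-clockwise)
                fragments.append((x, y))
    return fragments

frags = rasterize((0, 0), (4, 0), (0, 4))
```

Real rasterizers also interpolate the vertex attributes across the triangle, so each fragment carries colors, texture coordinates, and depth in addition to its raster position.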
Stage 4: Pixel Processing
The primary purpose of this stage is to determine the final color of each pixel, storing this data in the color buffer within the framebuffer.
Fragment Shader
Main Definition: This program is applied to each of the fragments to calculate their final color.
Other Information: This calculation may involve a variety of data stored in each fragment, in combination with data globally available during rendering, such as base colors, interpolated vertex colors, textures, and light sources.
Textures
Images applied to the surface of the shape
Light Sources
Their relative position and/or orientation may lighten or darken the surface color, depending on the direction the surface faces at each point, as specified by normal vectors
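The normal-based lighting idea above can be sketched as a minimal fragment-shader-style calculation. This assumes a Lambertian diffuse model (one simple choice among many): the base color is scaled by the cosine of the angle between the surface normal and the direction toward the light.

```python
import math

def shade(base_color, normal, light_dir):
    """Lambertian diffuse (an assumed model): color * max(0, normal . light_dir)."""
    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)
    n, l = normalize(normal), normalize(light_dir)
    intensity = max(0.0, sum(a * b for a, b in zip(n, l)))  # clamp back-facing to 0
    return tuple(c * intensity for c in base_color)

# Surface facing the light directly -> full brightness
color = shade((1.0, 0.5, 0.0), normal=(0.0, 0.0, 1.0), light_dir=(0.0, 0.0, 1.0))
```

A surface facing away from the light gets intensity 0 and renders black; intermediate angles darken the color proportionally, which is why the normal vector is central to lighting.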