Shader
In computer graphics, a shader is a type of computer program originally used for shading (the production of appropriate levels of light, darkness, and color within an image) but which now performs a variety of specialized functions in various fields of computer graphics special effects, performs video post-processing unrelated to shading, or even carries out functions unrelated to graphics at all.
Shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for a graphics processing unit (GPU), though this is not a strict requirement. Shading languages are usually used to program the programmable GPU rendering pipeline, which has mostly superseded the fixed-function pipeline that allowed only common geometry transformation and pixel-shading functions; with shaders, customized effects can be used. The position, hue, saturation, brightness, and contrast of all pixels, vertices, or textures used to construct a final image can be altered on the fly, using algorithms defined in the shader, and can be modified by external variables or textures introduced by the program calling the shader.
Shaders are used widely in cinema postprocessing, computer-generated imagery, and video games to produce a very wide range of effects. Beyond just simple lighting models, more complex uses include altering the hue, saturation, brightness or contrast of an image, producing blur, light bloom, volumetric lighting, normal mapping for depth effects, bokeh, cel shading, posterization, bump mapping, distortion, chroma keying (so-called "bluescreen/greenscreen" effects), edge detection and motion detection, psychedelic effects, and many others.
History
The modern use of "shader" was introduced to the public by Pixar with their "RenderMan Interface Specification, Version 3.0" originally published in May 1988.[1]
As graphics processing units evolved, major graphics software libraries such as OpenGL and Direct3D began to support shaders. The first shader-capable GPUs only supported pixel shading, but vertex shaders were quickly introduced once developers realized the power of shaders. The first video card with a programmable pixel shader was the Nvidia GeForce 3 (NV20), released in 2001.[1] Geometry shaders were introduced with Direct3D 10 and OpenGL 3.2. Eventually, graphics hardware evolved toward a unified shader model.
Design
Shaders are simple programs that describe the traits of either a vertex or a pixel. Vertex shaders describe the traits (position, texture coordinates, colors, etc.) of a vertex, while pixel shaders describe the traits (color, z-depth and alpha value) of a pixel. A vertex shader is called for each vertex in a primitive (possibly after tessellation); thus one vertex in, one (updated) vertex out. Each vertex is then rendered as a series of pixels onto a surface (block of memory) that will eventually be sent to the screen.
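As a minimal illustration of this division of labor, the GLSL sketch below pairs a vertex shader with a pixel (fragment) shader. The uniform name uMVP and the attribute layout are assumptions for the example, not part of any particular engine.

```glsl
// vertex shader: describes per-vertex traits (position, color)
#version 330 core
layout(location = 0) in vec3 aPosition;  // model-space position from the application
layout(location = 1) in vec3 aColor;     // per-vertex color
uniform mat4 uMVP;                       // model-view-projection matrix (assumed host-supplied)
out vec3 vColor;
void main() {
    vColor = aColor;                            // pass the color down the pipeline
    gl_Position = uMVP * vec4(aPosition, 1.0);  // one vertex in, one transformed vertex out
}

// fragment shader: describes per-pixel traits (final color)
#version 330 core
in vec3 vColor;      // interpolated across the primitive by the rasterizer
out vec4 fragColor;
void main() {
    fragColor = vec4(vColor, 1.0);  // write the fragment's color (alpha = 1)
}
```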
Shaders replace a section of the graphics hardware typically called the Fixed Function Pipeline (FFP), so-called because it performs lighting and texture mapping in a hard-coded manner. Shaders provide a programmable alternative to this hard-coded approach.[2]
The basic graphics pipeline is as follows:
- The CPU sends instructions (compiled shading language programs) and geometry data to the graphics processing unit, located on the graphics card.
- Within the vertex shader, the geometry is transformed.
- If a geometry shader is present in the graphics processing unit and active, it performs changes to the geometries in the scene.
- If a tessellation shader is present in the graphics processing unit and active, the geometries in the scene can be subdivided.
- The calculated geometry is triangulated (subdivided into triangles).
- Triangles are broken down into fragment quads (one fragment quad is a 2 × 2 fragment primitive).
- Fragment quads are modified according to the fragment shader.
- The depth test is performed; fragments that pass are written to the screen and may be blended into the frame buffer.
The graphics pipeline uses these steps to transform three-dimensional (or two-dimensional) data into useful two-dimensional data for display. In general, this is a large pixel matrix or "frame buffer".
Types
There are three types of shaders in common use, with one more recently added. While older graphics cards utilize separate processing units for each shader type, newer cards feature unified shaders which are capable of executing any type of shader. This allows graphics cards to make more efficient use of processing power.
2D Shaders
2D shaders act on digital images, also called textures in computer graphics work. They modify attributes of pixels. 2D shaders may take part in rendering 3D geometry. Currently the only 2D shader types are pixel shaders.
Pixel shaders
Pixel shaders, also known as fragment shaders, compute color and other attributes of each "fragment", a technical term usually meaning a single pixel. The simplest kinds of pixel shaders output one screen pixel as a color value; more complex shaders with multiple inputs/outputs are also possible.[3] Pixel shaders range from always outputting the same color, to applying a lighting value, to doing bump mapping, shadows, specular highlights, translucency and other phenomena. They can alter the depth of the fragment (for Z-buffering), or output more than one color if multiple render targets are active. In 3D graphics, a pixel shader alone cannot produce very complex effects, because it operates only on a single fragment, without knowledge of a scene's geometry. However, pixel shaders do have knowledge of the screen coordinate being drawn, and can sample the screen and nearby pixels if the contents of the entire screen are passed as a texture to the shader. This technique can enable a wide variety of two-dimensional postprocessing effects, such as blur, or edge detection/enhancement for cartoon/cel shaders. Pixel shaders may also be applied in intermediate stages to any two-dimensional images (sprites or textures) in the pipeline, whereas vertex shaders always require a 3D scene. For instance, a pixel shader is the only kind of shader that can act as a postprocessor or filter for a video stream after it has been rasterized.
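As a concrete example of the screen-sampling technique just described, here is a minimal GLSL fragment shader sketch for edge detection over a full-screen texture. The uniforms uScreen and uTexelSize, and the vUV input, are assumed to be supplied by the calling program and vertex stage.

```glsl
#version 330 core
in vec2 vUV;                 // full-screen texture coordinate from the vertex stage
out vec4 fragColor;
uniform sampler2D uScreen;   // the previously rendered frame, passed in as a texture
uniform vec2 uTexelSize;     // 1.0 / screen resolution in each axis

// luminance of the screen at a given coordinate
float luma(vec2 uv) {
    return dot(texture(uScreen, uv).rgb, vec3(0.299, 0.587, 0.114));
}

void main() {
    // horizontal and vertical gradients from neighboring pixels
    float gx = luma(vUV + vec2( uTexelSize.x, 0.0))
             - luma(vUV + vec2(-uTexelSize.x, 0.0));
    float gy = luma(vUV + vec2(0.0,  uTexelSize.y))
             - luma(vUV + vec2(0.0, -uTexelSize.y));
    float edge = length(vec2(gx, gy));   // gradient magnitude ~ edge strength
    fragColor = vec4(vec3(edge), 1.0);   // white edges on a black background
}
```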
3D Shaders
3D shaders act on 3D models or other geometry but may also access the colors and textures used to draw the model or mesh. Vertex shaders are the oldest type of 3D shader, generally operating on a per-vertex basis. Geometry shaders can generate new vertices from within the shader. Tessellation shaders are newer 3D shaders that act on batches of vertices all at once to add detail, such as subdividing a model into smaller groups of triangles or other primitives at runtime, to improve things like curves and bumps, or change other attributes.
Vertex shaders
Vertex shaders are the most established and common kind of 3D shader and are run once for each vertex given to the graphics processor. The purpose is to transform each vertex's 3D position in virtual space to the 2D coordinate at which it appears on the screen (as well as a depth value for the Z-buffer).[4] Vertex shaders can manipulate properties such as position, color and texture coordinates, but cannot create new vertices. The output of the vertex shader goes to the next stage in the pipeline, which is either a geometry shader if present, or the rasterizer. Vertex shaders can enable powerful control over the details of position, movement, lighting, and color in any scene involving 3D models.
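The sketch below shows a GLSL vertex shader manipulating position in the way described above, displacing each vertex with a sine wave before projecting it; the uniforms uTime and uMVP are assumptions for the example.

```glsl
#version 330 core
layout(location = 0) in vec3 aPosition;  // model-space vertex position
layout(location = 1) in vec2 aTexCoord;  // texture coordinate to pass through
uniform mat4 uMVP;    // model-view-projection matrix (assumed host-supplied)
uniform float uTime;  // elapsed time in seconds (assumed host-supplied)
out vec2 vTexCoord;

void main() {
    vec3 p = aPosition;
    p.y += 0.1 * sin(4.0 * p.x + uTime);  // displace along y; no new vertices created
    vTexCoord = aTexCoord;
    gl_Position = uMVP * vec4(p, 1.0);    // project to 2D clip coordinates plus depth
}
```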
Geometry shaders
Geometry shaders are a relatively new type of shader, introduced in Direct3D 10 and OpenGL 3.2; formerly available in OpenGL 2.0+ with the use of extensions.[5] This type of shader can generate new graphics primitives, such as points, lines, and triangles, from those primitives that were sent to the beginning of the graphics pipeline.[6]
Geometry shader programs are executed after vertex shaders. They take as input a whole primitive, possibly with adjacency information. For example, when operating on triangles, the three vertices are the geometry shader's input. The shader can then emit zero or more primitives, which are rasterized and their fragments ultimately passed to a pixel shader.
Typical uses of a geometry shader include point sprite generation, geometry tessellation, shadow volume extrusion, and single-pass rendering to a cube map. A typical real-world example of the benefits of geometry shaders would be automatic mesh complexity modification: a series of line strips representing control points for a curve is passed to the geometry shader, and depending on the complexity required the shader can automatically generate extra lines, each of which provides a better approximation of the curve.
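A minimal GLSL sketch of the first use listed above, point sprite generation: each incoming point is expanded into a quad emitted as a triangle strip. The uniform uSize is an assumption, and a full implementation would also account for perspective and aspect ratio.

```glsl
#version 330 core
layout(points) in;                               // one point primitive in
layout(triangle_strip, max_vertices = 4) out;    // up to four vertices out
uniform float uSize;  // half-width of the sprite in clip space (assumed)
out vec2 gUV;         // texture coordinate for the fragment stage

void main() {
    vec4 c = gl_in[0].gl_Position;  // the single input point (clip space)
    // emit the four corners of the quad as a triangle strip
    gUV = vec2(0.0, 0.0); gl_Position = c + vec4(-uSize, -uSize, 0.0, 0.0); EmitVertex();
    gUV = vec2(1.0, 0.0); gl_Position = c + vec4( uSize, -uSize, 0.0, 0.0); EmitVertex();
    gUV = vec2(0.0, 1.0); gl_Position = c + vec4(-uSize,  uSize, 0.0, 0.0); EmitVertex();
    gUV = vec2(1.0, 1.0); gl_Position = c + vec4( uSize,  uSize, 0.0, 0.0); EmitVertex();
    EndPrimitive();
}
```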
Tessellation shaders
As of OpenGL 4.0 and Direct3D 11, a new shader class called a tessellation shader has been added. It adds two new shader stages to the traditional model: tessellation control shaders (also known as hull shaders) and tessellation evaluation shaders (also known as domain shaders), which together allow simpler meshes to be subdivided into finer meshes at run-time according to a mathematical function. The function can be related to a variety of variables, most notably the distance from the viewing camera, to allow active level-of-detail scaling. This allows objects close to the camera to have fine detail, while those further away can have coarser meshes, yet seem comparable in quality. It can also drastically reduce mesh bandwidth requirements by allowing meshes to be refined once inside the shader units instead of downsampling very complex ones from memory. Some algorithms can upsample any arbitrary mesh, while others allow for "hinting" in meshes to dictate the most characteristic vertices and edges.
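As a sketch of the distance-based level-of-detail idea, the following GLSL tessellation control shader picks subdivision levels per patch from the camera distance. The uniform uCameraPos, the falloff formula, and the assumption that positions reach this stage in world space are all illustrative choices.

```glsl
#version 400 core
layout(vertices = 3) out;    // triangle patches: three control points out
uniform vec3 uCameraPos;     // camera position in world space (assumed)

void main() {
    // pass the patch's control points through unchanged
    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;

    // set the tessellation levels once per patch
    if (gl_InvocationID == 0) {
        float d = distance(uCameraPos, gl_in[0].gl_Position.xyz);
        float level = clamp(64.0 / d, 1.0, 64.0);  // nearer patches get finer meshes
        gl_TessLevelInner[0] = level;
        gl_TessLevelOuter[0] = level;
        gl_TessLevelOuter[1] = level;
        gl_TessLevelOuter[2] = level;
    }
}
```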
Primitive shaders
AMD's Vega microarchitecture added support for a new shader stage, primitive shaders.[7][8]
Other
Compute shaders
Compute shaders are not limited to graphics applications, but use the same execution resources for GPGPU. They may be used in graphics pipelines, e.g. for additional stages in animation or lighting algorithms (such as tiled forward rendering). Some rendering APIs allow compute shaders to easily share data resources with the graphics pipeline.
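A minimal GLSL compute shader sketch in this GPGPU style, independent of the rasterization pipeline: it brightens an image, one invocation per pixel. The image bindings, format, and workgroup size are arbitrary assumptions.

```glsl
#version 430 core
layout(local_size_x = 16, local_size_y = 16) in;             // 16x16 threads per workgroup
layout(rgba8, binding = 0) uniform readonly  image2D uInput;  // source image (assumed binding)
layout(rgba8, binding = 1) uniform writeonly image2D uOutput; // destination image

void main() {
    ivec2 p = ivec2(gl_GlobalInvocationID.xy);  // one invocation per pixel
    if (any(greaterThanEqual(p, imageSize(uInput)))) return;  // guard against overshoot
    vec4 c = imageLoad(uInput, p);
    imageStore(uOutput, p, vec4(min(c.rgb * 1.2, 1.0), c.a)); // brighten, clamp, keep alpha
}
```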
Parallel processing
Shaders are written to apply transformations to a large set of elements at a time, for example, to each pixel in an area of the screen, or for every vertex of a model. This is well suited to parallel processing, and most modern GPUs have multiple shader pipelines to facilitate this, vastly improving computation throughput.
A programming model with shaders is similar to a higher-order function for rendering, taking the shaders as arguments and providing a specific dataflow between intermediate results, enabling both data parallelism (across pixels, vertices, etc.) and pipeline parallelism (between stages). (See also MapReduce.)
Programming
The language in which shaders are programmed depends on the target environment. The official OpenGL and OpenGL ES shading language is OpenGL Shading Language, also known as GLSL, and the official Direct3D shading language is High Level Shader Language, also known as HLSL. Cg, a deprecated third-party shading language developed by Nvidia, outputs both OpenGL and Direct3D shaders. Apple released its own shading language, called Metal Shading Language, as part of the Metal framework.
See also
- GLSL
- SPIR-V
- HLSL
- Compute kernel
- Shading language
- GPGPU
- List of common shading algorithms
- Vector processor
References
^ "The RenderMan Interface Specification".
^ "ShaderWorks' update - DirectX Blog". 13 August 2003.
^ "GLSL Tutorial – Fragment Shader". 9 June 2011.
^ "GLSL Tutorial – Vertex Shader". 9 June 2011.
^ "Geometry Shader". OpenGL. Retrieved 2011-12-21.
^ "Pipeline Stages (Direct3D 10) (Windows)". msdn.microsoft.com.
^ "Radeon RX Vega Revealed: AMD promises 4K gaming performance for $499 - Trusted Reviews". 31 July 2017.
^ "The curtain comes up on AMD's Vega architecture".
Further reading
Upstill, Steve. The RenderMan Companion: A Programmer's Guide to Realistic Computer Graphics. Addison-Wesley. ISBN 0-201-50868-0.
Ebert, David S; Musgrave, F. Kenton; Peachey, Darwyn; Perlin, Ken; Worley, Steven. Texturing and modeling: a procedural approach. AP Professional. ISBN 0-12-228730-4.
Fernando, Randima; Kilgard, Mark. The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics. Addison-Wesley Professional. ISBN 0-321-19496-9.
Rost, Randi J. OpenGL Shading Language. Addison-Wesley Professional. ISBN 0-321-19789-5.
External links
- OpenGL geometry shader extension
- Riemer's DirectX & HLSL Tutorial: HLSL tutorial using DirectX with lots of sample code
- Pipeline Stages (Direct3D 10)