What Is OpenGL?
- OpenGL is a computer graphics rendering API
- With it, you can generate high-quality color images by rendering with geometric and image primitives
- It forms the basis of many interactive applications that include 3D graphics
- By using OpenGL, the graphics part of your application can be operating-system independent and window-system independent

Course Ground Rules
- We'll concentrate on the latest versions of OpenGL
- They enforce a new way to program with OpenGL
  - allows more efficient use of GPU resources
- If you're familiar with classic graphics pipelines, note that modern OpenGL doesn't support fixed-function graphics operations such as lighting and transformations
- All applications must use shaders for their graphics processing

The Evolution of the OpenGL Pipeline

In the Beginning …
- OpenGL 1.0 was released on July 1st, 1994
- Its pipeline was entirely fixed-function
  - the only operations available were fixed by the implementation
- The pipeline evolved, but remained fixed-function through OpenGL versions 1.1 through 2.0 (Sept. 2004)
- [Pipeline diagram: Vertex Data and Pixel Data feed Vertex Transform and Lighting, then Primitive Setup and Rasterization, then Fragment Coloring and Texturing, then Blending, with a Texture Store alongside]

An Evolutionary Change
- OpenGL 3.0 introduced the deprecation model
  - the method used to remove features from OpenGL
- The pipeline remained the same until OpenGL 3.1 (released March 24th, 2009)
  - introduced a change in how OpenGL contexts are used

  Context Type        Description
  ------------------  ------------------------------------------------------------
  Full                Includes all features (including those marked deprecated)
                      available in the current version of OpenGL
  Forward Compatible  Includes all non-deprecated features (i.e., creates a context
                      that would be similar to the next version of OpenGL)

More Evolution – Context Profiles
- OpenGL 3.2 also introduced context profiles
  - profiles control which features are exposed
  - it's like GL_ARB_compatibility, only not insane
  - currently two types of profiles: core and compatible

  Context Type        Profile     Description
  ------------------  ----------  -----------------------------------
  Full                core        All features of the current release
  Full                compatible  All features ever in OpenGL
  Forward Compatible  core        All non-deprecated features
  Forward Compatible  compatible  Not supported

OpenGL ES and WebGL
- OpenGL ES 2.0
  - designed for embedded and hand-held devices such as cell phones
  - based on OpenGL 3.1
  - shader based
- WebGL
  - JavaScript implementation of ES 2.0
  - runs in most recent browsers

Application Framework Requirements
- OpenGL applications need a place to render into
  - usually an on-screen window
- Need to communicate with the native windowing system
  - each windowing system interface is different
- We use GLUT (more specifically, freeglut)
  - simple, open-source library that works everywhere
  - handles all windowing operations: opening windows, input processing
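A minimal freeglut application skeleton might look like the sketch below. It needs a display to run, so it is shown here only for orientation; the window title and size are arbitrary, and the GLEW call anticipates the next slide.

```cpp
// Sketch of a freeglut application skeleton (requires a display and a
// linked freeglut/GLEW installation; not runnable headless).
#include <GL/glew.h>
#include <GL/freeglut.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    // ... issue draw calls here ...
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);                        // initialize GLUT
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE); // RGBA, double-buffered
    glutInitWindowSize(512, 512);
    glutCreateWindow("Hello OpenGL");             // creates the GL context

    glewInit();                                   // resolve GL entry points

    glutDisplayFunc(display);                     // register render callback
    glutMainLoop();                               // event loop (never returns)
    return 0;
}
```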

Simplifying Working with OpenGL
- Operating systems deal with library functions differently
  - compiler linkage and runtime libraries may expose different functions
- Additionally, OpenGL has many versions and profiles, which expose different sets of functions
  - managing function access is cumbersome and window-system dependent
- We use another open-source library, GLEW, to hide those details

Representing Geometric Objects
- Geometric objects are represented using vertices
- A vertex is a collection of generic attributes
  - positional coordinates
  - colors
  - texture coordinates
  - any other data associated with that point in space
- Position is stored in 4-dimensional homogeneous coordinates
- Vertex data must be stored in vertex buffer objects (VBOs)
- VBOs must be associated with vertex array objects (VAOs)

Our First Program
- We'll render a cube with colors at each vertex
- Our example demonstrates:
  - initializing vertex data
  - organizing data for rendering
  - simple object modeling
    - building up 3D objects from geometric primitives
    - building geometric primitives from vertices

Initializing the Cube's Data
- We'll build each cube face from individual triangles
- Need to determine how much storage is required
  - (6 faces)(2 triangles/face)(3 vertices/triangle)

  const int NumVertices = 36;

- To simplify communicating with GLSL, we'll use a vec4 class (implemented in C++) similar to GLSL's vec4 type
- We'll also typedef it to add logical meaning

  typedef vec4 point4;
  typedef vec4 color4;

Initializing the Cube's Data (cont'd)
- Before we can initialize our VBO, we need to stage the data
- Our cube has two attributes per vertex
  - position
  - color
- We create two arrays to hold the VBO data

  point4 points[NumVertices];
  color4 colors[NumVertices];

Vertex Array Objects (VAOs)
- VAOs store the data of a geometric object
- Steps in using a VAO
  1. generate VAO names by calling glGenVertexArrays()
  2. bind a specific VAO for initialization by calling glBindVertexArray()
  3. update VBOs associated with this VAO
  4. bind the VAO for use in rendering
- This approach allows a single function call to specify all the data for an object
  - previously, you might have needed to make many calls to make all the data current

Storing Vertex Attributes
- Vertex data must be stored in a VBO and associated with a VAO
- The code flow is similar to configuring a VAO
  1. generate VBO names by calling glGenBuffers()
  2. bind a specific VBO for initialization by calling glBindBuffer( GL_ARRAY_BUFFER, … )
  3. load data into the VBO using glBufferData( GL_ARRAY_BUFFER, … )
  4. bind the VAO for use in rendering with glBindVertexArray()
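The VAO and VBO steps above can be sketched together in code. This assumes a current OpenGL context and the points/colors arrays staged earlier, so it is not standalone-runnable; glBufferSubData is used here to load the two arrays into one buffer, which is one common arrangement rather than the only one:

```cpp
// Sketch: create a VAO, create a VBO, and load both vertex attributes
GLuint vao, buffer;

glGenVertexArrays(1, &vao);            // step 1: generate a VAO name
glBindVertexArray(vao);                // step 2: bind it for initialization

glGenBuffers(1, &buffer);              // generate a VBO name
glBindBuffer(GL_ARRAY_BUFFER, buffer); // bind it for initialization

// allocate storage for both attributes, then load each array in turn
glBufferData(GL_ARRAY_BUFFER, sizeof(points) + sizeof(colors),
             NULL, GL_STATIC_DRAW);
glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(points), points);
glBufferSubData(GL_ARRAY_BUFFER, sizeof(points), sizeof(colors), colors);
```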

Built-in Variables
- gl_Position: output position from the vertex shader
- gl_FragColor: output color from the fragment shader
  - only for ES, WebGL, and older versions of GLSL
  - present versions use an out variable
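A minimal pass-through pair in modern desktop GLSL illustrates both points; the attribute names (vPosition, vColor, fColor) and the #version 150 directive are assumptions, not taken from the course code:

```glsl
// Vertex shader: writes the required gl_Position, passes color along
#version 150
in  vec4 vPosition;
in  vec4 vColor;
out vec4 color;
void main() {
    color = vColor;
    gl_Position = vPosition;
}
```

```glsl
// Fragment shader: modern GLSL writes to an out variable, not gl_FragColor
#version 150
in  vec4 color;
out vec4 fColor;
void main() {
    fColor = color;
}
```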

Getting Your Shaders into OpenGL
- Shaders need to be compiled and linked to form an executable shader program
- OpenGL provides the compiler and linker
- A program must contain vertex and fragment shaders
  - other shaders are optional
- The steps:
  1. create a shader: glCreateShader()
  2. load the shader source: glShaderSource()
  3. compile the shader: glCompileShader()
  4. create a program: glCreateProgram()
  5. attach the shader to the program: glAttachShader()
  6. link the program: glLinkProgram()
  7. use the program: glUseProgram()
- These steps need to be repeated for each type of shader in the shader program
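The sequence reads naturally as code. A sketch, assuming a current GL context; the shader source strings are placeholders, and status checks with glGetShaderiv/glGetProgramiv are omitted for brevity:

```cpp
// Sketch of the compile-and-link sequence (no error checking shown)
const char* vSource = /* vertex shader text */ 0;
const char* fSource = /* fragment shader text */ 0;

GLuint vShader = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vShader, 1, &vSource, NULL);
glCompileShader(vShader);

GLuint fShader = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fShader, 1, &fSource, NULL);
glCompileShader(fShader);

GLuint program = glCreateProgram();
glAttachShader(program, vShader);
glAttachShader(program, fShader);
glLinkProgram(program);
glUseProgram(program);
```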

A Simpler Way
- We've created a routine for this course to make it easier to load your shaders
  - available at the course website

  GLuint InitShaders( const char* vFile, const char* fFile );

- InitShaders takes two filenames
  - vFile for the vertex shader
  - fFile for the fragment shader
- Fails if the shaders don't compile, or the program doesn't link

Vertex Shader Examples
- A vertex shader is initiated for each vertex output by glDrawArrays()
- A vertex shader must output a position in clip coordinates to the rasterizer
- Basic uses of vertex shaders
  - transformations
  - lighting
  - moving vertex positions
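A typical transforming vertex shader looks like the following GLSL sketch; the uniform and attribute names are illustrative, not from the course code:

```glsl
// Vertex shader applying model-view and projection transforms,
// producing the required clip-coordinate position
#version 150
in  vec4 vPosition;
uniform mat4 ModelView;
uniform mat4 Projection;
void main() {
    gl_Position = Projection * ModelView * vPosition;
}
```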

Camera Analogy and Transformations
- Projection transformations
  - adjust the lens of the camera
- Viewing transformations
  - the tripod: define the position and orientation of the viewing volume in the world
- Modeling transformations
  - moving the model
- Viewport transformations
  - enlarge or reduce the physical photograph

3D Transformations
- A vertex is transformed by 4×4 matrices
  - all affine operations are matrix multiplications
  - matrices are always post-multiplied
  - the product of a matrix and a vector is M v
  - all matrices are stored column-major in OpenGL
    - this is the opposite of what C programmers expect

Specifying What You Can See
- Set up a viewing frustum to specify how much of the world we can see
- Done in two steps
  - specify the size of the frustum (projection transform)
  - specify its location in space (model-view transform)
- Anything outside of the viewing frustum is clipped
  - primitive is either modified or discarded (if entirely outside the frustum)

Specifying What You Can See (cont'd)
- The OpenGL projection model uses eye coordinates
  - the eye is located at the origin, looking down the -z axis
- Projection matrices use a six-plane model:
  - near (image) plane and far (infinite) plane
    - both are distances from the eye (positive values)
  - enclosing planes
    - top & bottom, left & right

Modified Phong Model
- Computes a color or shade for each vertex using a lighting model (the modified Phong model) that takes into account
  - diffuse reflections
  - specular reflections
  - ambient light
  - emission
- Vertex shades are interpolated across polygons by the rasterizer

The Modified Phong Model
- The model is a balance between simple computation and physical realism
- The model uses
  - light positions and intensities
  - surface orientation (normals)
  - material properties (reflectivity)
  - viewer location
- Computed for each source and each color component

Surface Normals
- Normals define how a surface reflects light
- The application usually provides normals as a vertex attribute
- The current normal is used to compute the vertex's color
- Use unit normals for proper lighting
  - scaling affects a normal's length

Fragment Shaders
- A shader that's executed for each potential pixel
  - fragments still need to pass several tests before making it to the framebuffer
- There are lots of effects we can do in fragment shaders
  - per-fragment lighting
  - bump mapping
  - environment (reflection) maps

Per Fragment Lighting
- Compute lighting using the same model as for per-vertex lighting, but for each fragment
- Normals and other attributes are sent to the vertex shader and output to the rasterizer
- The rasterizer interpolates them and provides the inputs for the fragment shader

Height Fields
- A height field is a function y = f(x, z) where the y value represents a quantity such as the height above a point in the x-z plane
- Height fields are usually rendered by sampling the function to form a rectangular mesh of triangles or rectangles from the samples y_ij = f(x_i, z_j)

Displaying a Height Field
- Form a quadrilateral mesh
- Display each quad using

  for (int i = 0; i < NumVertices; i += 4)
      glDrawArrays(GL_LINE_LOOP, i, 4);

Adding Lighting
- Solid mesh: create two triangles for each quad
- Display with

  glDrawArrays(GL_TRIANGLES, 0, NumVertices);

- For better-looking results, we'll add lighting
- We'll do per-vertex lighting
  - leverage the vertex shader, since we'll also use it to vary the mesh in a time-varying way