Interactive Computer Graphics (CS 570)
This 280-page set of class notes was uploaded by Abe Jones on Saturday, September 12, 2015. The notes belong to CS 570 at West Virginia University, taught by Timothy McGraw in Fall. Since its upload, it has received 17 views. For similar materials see /class/202760/cs-570-west-virginia-university in Computer Science at West Virginia University.
Shadow Techniques: Shadow Volumes and Shadow Mapping

What is a shadow? An area that is not illuminated, or is only partially illuminated, because an opaque object intercepts the radiation between the area and the source of radiation. The fully occluded region is the umbra; the partially occluded region is the penumbra. Why add shadows? They set the mood and anchor objects in the scene. Realtime shadows go back as far as Zaxxon (Sega, 1982).

Representing shadows — two common methods: the polygonal boundary of the occluded volume (shadow volumes), or a depth map of the scene from the light's point of view (shadow mapping).

Shadow volumes: a geometric approach to shadows, similar to the light volumes drawn in deferred shading. Extrude a shadow volume from the silhouette edges of the mesh in the light direction, then use the stencil buffer to mark pixels where the volume intersects other objects. Shadow casters need modified geometry (or a geometry shader).

Silhouette determination. For a continuous smooth surface, the silhouette is where the surface normal is perpendicular to the light direction. For closed polygonal meshes, adjacent faces share an edge: edge e between face i and face j is a silhouette edge if n_i . l > 0 and n_j . l <= 0 (one face lit, the other not).

Silhouette extrusion: the shadow volume is defined by the points p on the silhouette and those points pushed away from the light, p'.

Which fragments are in shadow? Render the scene to the depth buffer; render back (or front) faces to the stencil buffer; render front (or back) faces to the stencil buffer; the stencil value then determines which fragments are in shadow.

GPU approach: use two meshes for each shadow caster — the original mesh and a mesh representing the shadow volume. Draw the scene in the shadow color; silhouette determination and extrusion happen in the vertex shader; draw the volumes to the stencil buffer; draw the lit color where the stencil test passes.

Shadow volume representation — the quad-edge mesh, created from the original shadow caster mesh. For each edge in that mesh, form a degenerate (zero-area) quadrilateral; one pair of its vertices carries one face normal, the other pair carries the other face normal. In the vertex program: form the light vector, compute the dot product with the normal, and extrude vertices when n . l < 0.
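As a sanity check on the silhouette rule above, here is a minimal pure-Python sketch; the function names and toy vectors are illustrative, not from the notes:

```python
# Silhouette-edge test from the notes: the edge shared by faces i and j is a
# silhouette edge when one face points toward the light and the other away,
# i.e. (n_i . l > 0) and (n_j . l <= 0).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_silhouette_edge(n_i, n_j, l):
    """True if the shared edge of faces with normals n_i, n_j lies on the
    silhouette with respect to light direction l (pointing toward the light)."""
    fi = dot(n_i, l) > 0.0   # face i is lit
    fj = dot(n_j, l) > 0.0   # face j is lit
    return fi != fj          # exactly one of the two faces is lit

light = (0.0, 0.0, 1.0)
# One face toward the light, one away -> silhouette edge:
print(is_silhouette_edge((0.0, 0.0, 1.0), (0.0, 0.0, -1.0), light))  # True
# Both faces lit -> interior edge:
print(is_silhouette_edge((0.0, 0.0, 1.0), (0.0, 0.1, 1.0), light))   # False
```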
Stencil Tests. Z-pass: render front faces, incrementing the stencil when the depth test passes; render back faces, decrementing the stencil when the depth test passes. Z-fail: render back faces, incrementing the stencil when the depth test fails; render front faces, decrementing the stencil when the depth test fails. A two-sided stencil allows this in a single pass, and the result is order independent when using WRAP stencil operations.

Z-pass vs Z-fail. Z-pass produces inverse shadow artifacts when the camera is inside a shadow volume: only the back faces (as defined by winding order) are then visible. Z-fail handles this case correctly.

Shadow volume capping: Z-fail requires caps at the ends of the shadow volume (in particular a far end cap); without the cap, fragments beyond the open end of the volume would be misclassified. Z-fail also has a problem if the shadow volume intersects the far clipping plane (this is not a problem with Z-pass, which counts crossings in front of the fragment). Solution: put the far clip plane at infinity.

Infinite view volume. With near n, far f, and frustum sides right r, left l, top t, bottom b, the perspective projection matrix is

    P = [ 2n/(r-l)   0          (r+l)/(r-l)    0
          0          2n/(t-b)   (t+b)/(t-b)    0
          0          0          -(f+n)/(f-n)   -2fn/(f-n)
          0          0          -1             0          ]

Take the limit as f goes to infinity (remember L'Hopital's rule):

    P_inf = [ 2n/(r-l)   0          (r+l)/(r-l)   0
              0          2n/(t-b)   (t+b)/(t-b)   0
              0          0          -1            -2n
              0          0          -1            0   ]

Using this projection matrix we don't have to worry about clipping the shadow volume.

Z-pass vs Z-fail, summarized. Z-pass: no need to compute or draw caps, and no need for an infinite view volume, but it fails when the camera is inside a shadow volume. Z-fail: no problem with the camera inside a shadow volume, but there is more geometry to compute and render, and it requires the far clip plane at infinity.

Extrusion to infinity. As the light gets close to an occluder, the shadow may not project far enough into the scene if the extrusion distance is finite. Instead extrude to infinity in the vertex shader:

    gl_Position = gl_ModelViewProjectionMatrix *
                  vec4(gl_Vertex.xyz * light.w - light.xyz, 0.0);

A homogeneous point with w = 0 can be thought of as a vector, or as a point at infinity. For a directional light, light.w = 0 and gl_Position will be infinitely far away in the light direction. Since parallel lines meet at infinity, we don't need a far end cap for directional lights.
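The f -> infinity limit of the frustum matrix can be verified numerically; a pure-Python sketch with a tiny matrix helper (the function names are ours):

```python
# The glFrustum projection matrix and its far-plane-at-infinity limit,
# as derived in the notes.

def mat_vec(M, v):
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

def frustum(n, f, l, r, b, t):
    return [[2*n/(r-l), 0, (r+l)/(r-l), 0],
            [0, 2*n/(t-b), (t+b)/(t-b), 0],
            [0, 0, -(f+n)/(f-n), -2*f*n/(f-n)],
            [0, 0, -1, 0]]

def infinite_frustum(n, l, r, b, t):
    # Limit of the third row as f -> infinity: [0, 0, -1, -2n].
    return [[2*n/(r-l), 0, (r+l)/(r-l), 0],
            [0, 2*n/(t-b), (t+b)/(t-b), 0],
            [0, 0, -1, -2*n],
            [0, 0, -1, 0]]

P = frustum(1.0, 1e8, -1, 1, -1, 1)
Pinf = infinite_frustum(1.0, -1, 1, -1, 1)
# With a huge finite far plane the two matrices nearly coincide:
diff = max(abs(P[i][j] - Pinf[i][j]) for i in range(4) for j in range(4))
print(diff < 1e-6)  # True

# A point at infinity (w = 0) extruded along -z lands exactly on the far
# plane (z/w = 1) and is never clipped by it:
x = mat_vec(Pinf, [0.0, 0.0, -1.0, 0.0])
print(x[2] / x[3])  # 1.0
```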
For a point light, light.w = 1.0 and gl_Position will be infinitely far away in the vertex-to-light direction.

Hybrid pass/fail: test whether the camera is in shadow. If not, use Z-pass; if so, use Z-fail. Can you see the near cap? If so, add unextruded faces to the near cap. Can you see the far cap? If so, add extruded faces to the far cap.

Soft shadows with shadow volumes: sum jittered hard shadows, or use penumbra wedges (Tomas Akenine-Moller and Ulf Assarsson, "Approximate Soft Shadows on Arbitrary Surfaces using Penumbra Wedges", Eurographics 2002). A penumbra wedge is built around each silhouette edge of the shadow-casting object, with an entry point and an exit point per ray. [Figure: light source, shadow-casting object, penumbra wedge.]

Shadow mapping: an image-space technique. Rendering from the camera point of view with z-buffering gives a visible-surface representation of the scene — what is in the framebuffer and depth buffer is visible; everything else is not. Rendering from the light point of view gives a lit-surface representation — whatever passes the depth test is lit; everything else is in shadow. Supported by several OpenGL and GLSL features: depth textures, FBOs, shadow-map samplers.

Shadow mapping takes two passes. Initialization: create a depth texture and attach it to an FBO. Pass 1: render the scene from the light's position and direction; only the depth buffer is needed (no color texture); render to the framebuffer-attached depth texture. Pass 2: render the scene from the camera's position and direction; transform each fragment into light space and read from the depth texture. If the light-space depth is greater than what is in the depth texture, the fragment is shadowed.

Pass 1 — render to depth texture. Consider the spotlight as a camera. A projection matrix P_L characterizes the spotlight cone angle and cutoff distance (analogous to the camera field of view and far clip distance). A view matrix V_L characterizes the light direction (analogous to the camera look vector). A point p in local coordinates can be transformed into light clip coordinates by p_L = P_L V_L M p. Use this transformation matrix in the vertex program in pass 1.

Pass 2 — compare to depth. Recall the scale-and-bias matrix from projective textures.
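The pass-1 transform chain p_L = P_L V_L M p is just a matrix product applied to a homogeneous point; a toy pure-Python sketch (the concrete matrices here are illustrative stand-ins, not a real light setup):

```python
# Compose modeling (M), light view (V_L), and light projection (P_L)
# matrices, then transform a local-space point into light clip space.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_vec(M, v):
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

def translate(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

M  = translate(5, 0, 0)    # modeling: the object sits at x = 5
VL = translate(0, 0, -10)  # toy "light view": shift the scene 10 units down -z
PL = [[0.5, 0, 0, 0],      # toy orthographic-style "light projection"
      [0, 0.5, 0, 0],
      [0, 0, 0.5, 0],
      [0, 0, 0, 1]]

p_local = [1, 0, 0, 1]
p_light_clip = mat_vec(mat_mul(PL, mat_mul(VL, M)), p_local)
# world (6,0,0) -> light space (6,0,-10) -> light clip (3,0,-5):
print(p_light_clip)  # [3.0, 0.0, -5.0, 1]
```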
The scale-and-bias matrix from projective textures is

    S = [ 1/2  0    0    1/2
          0    1/2  0    1/2
          0    0    1/2  1/2
          0    0    0    1   ]

This matrix transforms points from clip coordinates [-1, 1] to texture coordinates [0, 1]. We will need it when reading from the depth texture.

Pass 2 — compare to depth. In the vertex shader, convert the vertex coordinates into light-space texture coordinates and find the depth. With modeling matrix M, light view matrix V_L, camera view matrix V, light projection matrix P_L, and camera projection matrix P, the shadow coordinate is S P_L V_L V^-1 applied to the eye-space vertex. Pass 2 vertex program:

    varying vec4 ShadowCoord;
    void main(void)
    {
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
        ShadowCoord = ScaleAndBias * LightProjectionMatrix * LightViewMatrix *
                      CameraViewInverse * gl_ModelViewMatrix * gl_Vertex;
    }

In practice, compute the product of these matrices once on the CPU and load it as the texture matrix:

    glMatrixMode(GL_TEXTURE);
    glPushMatrix();
    glLoadIdentity();
    glTranslatef(0.5f, 0.5f, 0.5f);
    glScalef(0.5f, 0.5f, 0.5f);
    gluPerspective(45.0f, 1.0f, Near, Far);
    gluLookAt(LightPosX, LightPosY, LightPosZ, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);
    glMultMatrixf(InverseViewMatrix);

Then in GLSL: ShadowCoord = gl_TextureMatrix[0] * gl_ModelViewMatrix * gl_Vertex;

Pass 2 fragment program — use shadow2DProj to return the result of the depth comparison:

    uniform sampler2DShadow ShadowMap;
    varying vec4 ShadowCoord;
    void main(void)
    {
        // lighting computations here
        gl_FragColor = (diffuseColor + specularColor) *
                       shadow2DProj(ShadowMap, ShadowCoord).r + ambientColor;
    }

For shadow samplers, set:

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB, GL_COMPARE_R_TO_TEXTURE);

Shadow-map sampler access options. With a shadow sampler:

    uniform sampler2DShadow ShadowMap;
    shadow2DProj(ShadowMap, ProjShadow).r;
    // or
    shadow2D(ShadowMap, ProjShadow.stp / ProjShadow.q).r;

With an ordinary sampler, read the depth and do the comparison manually:

    uniform sampler2D DepthMap;
    float z = texture2DProj(DepthMap, ProjShadow).r;
    // or: float z = texture2D(DepthMap, ProjShadow.st / ProjShadow.q).r;
    float r = ProjCoord.p / ProjCoord.q;
    float shadow = float(r > z);

Artifacts — depth precision: use glPolygonOffset to push depth values back when rendering to the depth texture. The problem also depends on resolution: we only have a single depth value per texel.
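The scale-and-bias mapping is easy to check: every component goes through t = 0.5 c + 0.5. A pure-Python sketch:

```python
# Scale-and-bias matrix from the notes: maps clip-space [-1,1] to
# texture-space [0,1] on every axis.

def mat_vec(M, v):
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

S = [[0.5, 0.0, 0.0, 0.5],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.5, 0.5],
     [0.0, 0.0, 0.0, 1.0]]

print(mat_vec(S, [-1.0, -1.0, -1.0, 1.0]))  # [0.0, 0.0, 0.0, 1.0]
print(mat_vec(S, [1.0, 1.0, 1.0, 1.0]))     # [1.0, 1.0, 1.0, 1.0]
print(mat_vec(S, [0.0, 0.0, 0.0, 1.0]))     # [0.5, 0.5, 0.5, 1.0]
```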
Fixing depth problems. glPolygonOffset(GLfloat factor, GLfloat units): depth values are offset by factor * dz + r * units, where dz is the slope of z in screen space and r is the smallest value guaranteed to produce a resolvable offset. A positive offset pushes objects away. The offset happens after the perspective division, so z values are no longer linear. Why should the offset depend on slope? A steeply sloped polygon spans a large depth range within a single pixel, so a constant offset may not be enough to resolve it.

Other approaches: use a modified projection matrix which applies an offset; draw back faces into the shadow map (this moves the artifacts to the shadowed side, where they may be less noticeable); or draw front faces into one shadow map and back faces into another, and use the average of the two depths (the midpoint) in comparisons — this may still have problems with thin objects.

Artifacts — aliasing. Increasing the resolution of the shadow map helps, but aliasing can't be solved by filtering depth values: the depth comparison is boolean, so nearest-neighbor vs linear filtering makes no real difference.

Percentage-closer filtering (PCF): compare first, then average the boolean results. A 2x2 PCF function:

    const float texel = 1.0 / 512.0;
    const float w = 0.25;

    float pcf()
    {
        vec4 depths;
        depths[0] = texture2D(ShadowMap, ProjCoord.st / ProjCoord.q + texel * vec2(0.0, 0.0)).r;
        depths[1] = texture2D(ShadowMap, ProjCoord.st / ProjCoord.q + texel * vec2(1.0, 0.0)).r;
        depths[2] = texture2D(ShadowMap, ProjCoord.st / ProjCoord.q + texel * vec2(0.0, 1.0)).r;
        depths[3] = texture2D(ShadowMap, ProjCoord.st / ProjCoord.q + texel * vec2(1.0, 1.0)).r;
        float r = ProjCoord.p / ProjCoord.q;
        vec4 inShadow = vec4(greaterThanEqual(vec4(r), depths));
        return dot(inShadow, vec4(0.25, 0.25, 0.25, 0.25));
    }

With 2x2 PCF you get five possible shadowing percentages: 0, 0.25, 0.50, 0.75, and 1.0 (no PCF gives only 0 or 1). More percentages are possible with larger filters. Nvidia and ATI have implemented PCF in hardware: simply enable linear filtering and compare mode.

Soft shadows. Percentage-closer soft shadows (PCSS): the PCF kernel size depends on the distance to the light. Variance shadow maps (VSM): think in terms of the distribution of depths in a neighborhood, with a density function p(z) and its cumulative distribution function.
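The compare-then-average order of PCF is the whole point; a pure-Python sketch with a dict standing in for the depth texture (all names are ours):

```python
# 2x2 percentage-closer filtering: do the boolean depth comparison per tap
# first, then average the four results.

def pcf_2x2(shadow_map, x, y, fragment_depth):
    taps = [(x, y), (x + 1, y), (x, y + 1), (x + 1, y + 1)]
    in_shadow = [1.0 if fragment_depth > shadow_map[t] else 0.0 for t in taps]
    return sum(in_shadow) / 4.0   # one of 0.0, 0.25, 0.5, 0.75, 1.0

depth_tex = {(0, 0): 0.4, (1, 0): 0.4, (0, 1): 0.9, (1, 1): 0.9}
print(pcf_2x2(depth_tex, 0, 0, 0.5))  # 0.5 -> on a shadow boundary
print(pcf_2x2(depth_tex, 0, 0, 0.3))  # 0.0 -> fully lit
print(pcf_2x2(depth_tex, 0, 0, 1.0))  # 1.0 -> fully shadowed
```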
The CDF is F(z) = P(z_s <= z) = sum over z_i <= z of p(z_i), so P(z_s >= t) = 1 - F(t) is the fraction of stored depth samples at or beyond the surface depth t — the fraction which is unshadowed. Chebyshev's one-sided inequality bounds it:

    P(z >= t) <= sigma^2 / (sigma^2 + (t - mu)^2)    for t > mu

This bound is an equality for planar occluders and receivers; use it as the unshadowed fraction.

Computing the moments: mu = E(z) and sigma^2 = E(z^2) - E(z)^2. The shadow map stores depth and depth squared; compute the expected values by averaging (filtering), using mipmapping or custom filters in the fragment shader.

[Figure 3, left to right: (1) standard shadow mapping; (2) 5x5 percentage-closer filtering; (3) 5x5 bilinear percentage-closer filtering; (4) variance shadow maps with a 5x5 separable Gaussian blur.]

Artifacts — view dependence: shadow-map sampling rates vary across the scene. Shadow-map texels are large when you are close to objects, because shadow maps are square in light space. Solution: make shadow maps square in the camera's normalized device coordinates. Variants: adaptive shadow maps (ASM), perspective shadow maps (PSM), trapezoidal shadow maps (TSM), cascaded shadow maps (CSM), parallel-split shadow maps (PSSM).

Shadow volumes vs shadow maps. Shadow mapping: no mesh requirements (open or closed — it can even cast shadows from alpha-test billboards); sampling artifacts; needs six maps (a cube) for point lights; no problem with the camera inside a shadow. Shadow volumes: closed meshes only; no sampling artifacts; handles spotlights, point lights, and directional lights.

What color is a shadow? Wrong answer: black. Bad answer: grey. Better answer: the ambient color, I_a = L_a k_a — though this may be too dark, since it neglects indirect illumination. More realistic answer: ambient plus a little diffuse, I_a + 0.1 I_d.

Transformation Matrices. Points p = (p_x, p_y, p_z, 1) carry translation; vectors v = (v_x, v_y, v_z, 0) do not.

Identity:

    I = [ 1 0 0 0
          0 1 0 0
          0 0 1 0
          0 0 0 1 ]        I p = p,  I v = v

Scaling:

    S = [ s_x 0   0   0
          0   s_y 0   0
          0   0   s_z 0
          0   0   0   1 ]  S p = (s_x p_x, s_y p_y, s_z p_z, 1),  S v = (s_x v_x, s_y v_y, s_z v_z, 0)

Translation:

    T = [ 1 0 0 t_x
          0 1 0 t_y
          0 0 1 t_z
          0 0 0 1   ]      T p = (p_x + t_x, p_y + t_y, p_z + t_z, 1),  T v = v

Rotation (about z):

    R = [ cos t  -sin t  0  0
          sin t   cos t  0  0
          0       0      1  0
          0       0      0  1 ]
    R p = (p_x cos t - p_y sin t, p_x sin t + p_y cos t, p_z, 1)
    R v = (v_x cos t - v_y sin t, v_x sin t + v_y cos t, v_z, 0)

Inverse transformations, M^-1 M = M M^-1 = I. Identity: its own inverse. Scaling: scale by (1/s_x, 1/s_y, 1/s_z). Translation: translate by (-t_x, -t_y, -t_z). Rotation: R^-1 = R^T (rotate by -t).

View matrix (camera position and orientation). gluLookAt(p_x, p_y, p_z, look_x, look_y, look_z, up_x, up_y, up_z) builds V = R T, with the camera basis r = right, u = up, l = look (unit vectors) and camera position p:

    R = [  r_x  r_y  r_z 0          T = [ 1 0 0 -p_x
           u_x  u_y  u_z 0                0 1 0 -p_y
          -l_x -l_y -l_z 0                0 0 1 -p_z
           0    0    0   1 ]              0 0 0  1   ]

This is equivalent to glMultMatrixf(R) followed by glTranslatef(-p_x, -p_y, -p_z). Later it will be useful to have the view matrix in an array: call glGetFloatv(GL_MODELVIEW_MATRIX, V) before applying any modeling transformations.

Combining transformations — modeling. Rotate about the object's center (rotate, then translate, p' = T R p):

    glTranslatef(...);
    glRotatef(...);
    glutSolidCube(1.0);

Orbit about the origin (translate, then rotate, p' = R T p):

    glRotatef(...);
    glTranslatef(...);
    glutSolidCube(1.0);

For inverses of products, (A B)^-1 = B^-1 A^-1.

Combining transformations — viewing. First-person camera (rotate about the camera position): glRotatef(); glTranslatef();. Orbit camera about the origin: glTranslatef(); glRotatef();.

Projection matrix: transforms the viewing volume (a frustum) into a cube; built with glFrustum or gluPerspective (see the frustum matrix under "Infinite view volume" above).

Moving between coordinate systems. Modeling matrix M: local coordinates -> world coordinates. View matrix V: world coordinates -> eye coordinates. Projection matrix P: eye coordinates -> clip coordinates. Perspective division (1/w): clip coordinates -> normalized device coordinates (the [-1, 1] cube). Viewport transformation: NDC -> window (pixel) coordinates.

Modelview matrix VM: transforms vertex positions; transforms normals (actually with the transpose of the inverse of VM — be careful with lighting; see GL_NORMALIZE and GL_RESCALE_NORMAL); and transforms light positions. With VM = I when the light is specified, the light has a fixed position in eye coordinates; with only V on the stack (M = I), the light has a fixed position in world coordinates; otherwise use M to position or animate the light (see page 62 in the notes).
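The composition rules above (rotate-then-translate vs translate-then-rotate, and the view matrix mapping the eye to the origin) can be checked numerically; a pure-Python sketch using an exact 90-degree rotation so the arithmetic stays integral:

```python
# "Rotate then translate" (spin in place, then move) vs "translate then
# rotate" (orbit about the origin), plus the view-matrix translation part.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_vec(M, v):
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

R = [[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]  # Rz(90 deg)
T = [[1, 0, 0, 5], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]   # translate +5 in x
p = [1, 0, 0, 1]

print(mat_vec(mat_mul(T, R), p))  # rotate, then translate: [5, 1, 0, 1]
print(mat_vec(mat_mul(R, T), p))  # translate, then rotate (orbit): [0, 6, 0, 1]

# The translation part of the view matrix maps the camera position to the
# eye-space origin (camera axes taken equal to the world axes here):
eye = [3, 2, 5, 1]
Tv = [[1, 0, 0, -3], [0, 1, 0, -2], [0, 0, 1, -5], [0, 0, 0, 1]]
print(mat_vec(Tv, eye))           # [0, 0, 0, 1]
```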
Texture Mapping. Multiple textures per primitive are used not only for diffuse color but for other material properties. Multiple texture coordinates (up to GL_MAX_TEXTURE_COORDS_ARB): void glMultiTexCoord2f(GLenum target, GLfloat s, GLfloat t), with glMatrixMode(GL_TEXTURE) for texture matrices. Multiple texture images (up to GL_MAX_TEXTURE_IMAGE_UNITS_ARB): glTexImage2D, glBindTexture.

Depth buffering. Depth testing: glEnable/glDisable(GL_DEPTH_TEST). Depth comparison function: glDepthFunc(GL_LESS, GL_GREATER, etc.). Depth buffer writing: glDepthMask(GL_TRUE / GL_FALSE). Depth precision and z-fighting problems: move the near clip plane out and the far clip plane in. The relation between position and z-buffer value is nonlinear due to the 1/w division, so there is more precision at the near clip plane, and moving the near plane has the larger effect.

Lighting and materials. We won't be using the OpenGL lighting model, but we can still use OpenGL properties (e.g. light position and ambient color) for passing variables to the GPU.

Image primitives. We won't draw image primitives, since on current hardware it is faster to draw textured quadrilaterals. But it is nice to be able to grab a frame from your app:

    glReadBuffer(GL_BACK);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, w, h, GL_BGRA_EXT, GL_UNSIGNED_BYTE, pArray);

Pipeline. glFlush() forces execution of all commands in the pipeline and returns immediately. glFinish() flushes the pipeline but does not return until the effects of all previously called GL commands are complete. Benchmarking:

    glClear(...);
    glFinish();
    // start timer
    // draw stuff
    glFinish();
    // end timer

Use a high-resolution timer like QueryPerformanceCounter, not the GLUT timer.

Check for errors:

    inline void glError()
    {
        GLenum errCode;
        const GLubyte* errString;
        if ((errCode = glGetError()) != GL_NO_ERROR) {
            errString = gluErrorString(errCode);
            printf("OpenGL Error: %s\n", errString);
        }
    }

OpenGL pipeline: per-vertex operations, fragment processing, pixel pack/unpack (vertices, fragments, textures). [Pipeline diagram.] After the fragment processing stage come: the scissor test — void glScissor(x, y, width, height), in window coordinates; the alpha test — void glAlphaFunc(func, ref); and the stencil test — void glStencilMask(mask) selects which bits to write to.
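The nonlinear depth-precision claim above can be made concrete. A pure-Python sketch of the window-space depth as a function of eye-space z (formula follows from the frustum matrix plus perspective division; names are ours):

```python
# Window depth d in [0,1] as a function of eye-space z in [-n, -f]:
#   z_ndc = (f+n)/(f-n) + 2fn/((f-n) * z_eye),   d = 0.5 * z_ndc + 0.5

def window_depth(z_eye, n, f):
    z_ndc = (f + n) / (f - n) + 2.0 * f * n / ((f - n) * z_eye)
    return 0.5 * z_ndc + 0.5

n, f = 1.0, 1000.0
# Roughly half the depth-buffer range is spent on eye-space z in [-n, -2n]:
print(window_depth(-2.0, n, f) > 0.5)    # True: already past the midpoint
print(round(window_depth(-500.0, n, f), 3))  # 0.999: far half is crammed in
```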
void glStencilFunc(func, ref, mask) sets the test between the stencil value and the reference; void glStencilOp(fail, zfail, zpass) sets what to do in each of the three cases. Then come the depth test, blending, dithering, and logical operations — void glLogicOp(GLenum opcode) sets the logical operation applied between the incoming RGBA color and the RGBA color at the corresponding location in the framebuffer.

Shading and Lighting. Shading models: flat, Gouraud (smooth), Phong. Lighting models: Phong and Blinn-Phong, with per-vertex or per-pixel implementations.

Shading. Flat shading: use the triangle face normal for lighting — a solid color per triangle; glShadeModel(GL_FLAT). Gouraud: use vertex normals for lighting — the color is computed at each vertex and interpolated over the triangle; glShadeModel(GL_SMOOTH). Phong shading: interpolate the parameters over the triangle and compute the color in the fragment shader — the color is computed per fragment.

Local lighting models compute appearance based on the relations between each point on the surface, the viewer, and the light sources: no volumetric scattering, no interreflections between objects, no shadows. The vectors involved (unit length, pointing away from the surface): v the view vector, n the normal vector, l the light vector, r the reflection vector.

Viewer models. Infinite viewer: v = -look; faster, since the view vector is the same for all vertices; glLightModeli(GL_LIGHT_MODEL_LOCAL_VIEWER, GL_FALSE). Local viewer: with eye position e and surface point p, v = (e - p)/|e - p|; glLightModeli(GL_LIGHT_MODEL_LOCAL_VIEWER, GL_TRUE).

Light sources. Directional light: a model of a light infinitely far away (the sun can be approximated as a directional light); the light vector l is constant. Point light: characterized by a position; the light vector must be computed for each lighting computation.

Light attenuation. Physically accurate attenuation uses the factor 1/d^2, where d is the distance from light to surface; in practice this is very harsh and falls off too quickly. Quadratic attenuation with a softer falloff uses 1/(a + b d + c d^2), for example 1/(1 + 0.1 d + 0.01 d^2).

Phong lighting has three terms. Ambient: a constant color — a bad approximation of global illumination. Diffuse: the Lambertian term max(n . l, 0), independent of view direction and dependent on the angle between the incoming rays and the surface normal.
Specular: gives highlights, max(r . v, 0)^s, where s is the shininess.

Global illumination effects — interreflections and shadows — are ignored or greatly simplified by this model.

Phong ambient term: simulated indirect lighting that does not depend on position or normal:

    I_a = k_a L_a

where k_a is the material ambient color, L_a the light ambient color, and I_a the intensity at the surface.

Lambertian reflection. Light emitted from the source -> light received at the surface: depends on the cosine of the angle between the normal and the light direction (this models how the light spreads over area). Light received at the surface -> light reflected from the surface: does not depend on the view direction (a uniform distribution of 'microfacet' normal vectors).

Phong diffuse term — attenuated Lambertian reflection:

    I_d = k_d L_d max(n . l, 0) / (a + b d + c d^2)

where k_d is the material diffuse color and L_d the light diffuse color.

Phong specular term — not physically based, although the reflection direction is geometrically accurate; the equation simply generates a plausible highlight about the reflection direction:

    I_s = k_s L_s max(r . v, 0)^s / (a + b d + c d^2)

with s the shininess. The full Phong lighting equation:

    I = L_a k_a + [ k_d L_d max(n . l, 0) + k_s L_s max(r . v, 0)^s ] / (a + b d + c d^2)

Blinn-Phong modifies the Phong model so that the specular term can be estimated quickly — less precise but faster. Form the halfway vector h = (l + v)/|l + v| and replace (r . v) with (h . n). This is the lighting model provided by the OpenGL fixed-function pipeline.

Phong vs Blinn-Phong, when all vectors lie in a plane: let t be the angle between n and l, and let q be the angle between r and v. By the ideal reflection law the angle between n and r is also t, so the total angle between l and v is 2t + q. Since, by definition, h is halfway between l and v, the angle between l and h is t + q/2, and therefore the angle between n and h is q/2 — proportional to the angle between r and v, so the two models produce highlights of the same shape with different falloff rates.

Coordinate system for lighting: we need all vectors to be in the same coordinate system in order to compute lighting. Some choices: the object's local coordinate system, tangent space, world coordinates, or eye coordinates.
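The Phong and Blinn-Phong specular terms above can be compared directly in a pure-Python sketch (function names are ours):

```python
# Phong uses r = 2(n.l)n - l and max(r.v, 0)^s; Blinn-Phong replaces r.v
# with n.h, where h = (l + v)/|l + v| is the halfway vector.
import math

def normalize(v):
    m = math.sqrt(sum(x * x for x in v))
    return tuple(x / m for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong_specular(n, l, v, shininess):
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    return max(dot(r, v), 0.0) ** shininess

def blinn_specular(n, l, v, shininess):
    h = normalize(tuple(lc + vc for lc, vc in zip(l, v)))
    return max(dot(n, h), 0.0) ** shininess

n = (0.0, 0.0, 1.0)
l = v = (0.0, 0.0, 1.0)                # light and viewer straight above
print(phong_specular(n, l, v, 32))     # 1.0: mirror configuration
print(blinn_specular(n, l, v, 32))     # 1.0

l2 = normalize((1.0, 0.0, 1.0))        # off-axis light: both terms drop
print(phong_specular(n, l2, v, 32) < 1.0, blinn_specular(n, l2, v, 32) < 1.0)
```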
Which coordinate system to use? Object local coordinate system: transform the lights using the inverse of the object's modeling transformation (per object). Tangent space: transform the lights into tangent space (per vertex or per fragment); tangent-space normals are constant, which is useful in normal mapping. World coordinate system: transform vertices and normals using the object's modeling transformation. Eye coordinate system: transform lights using the view matrix, and normals using the normal matrix (the modelview matrix inverse transpose).

Coordinate systems. If a matrix T transforms points from space A to space B, then T^-1 transforms points from B to A. If T transforms points from A to B and S transforms points from B to C, then ST transforms points from A to C. If M transforms points, then M^-T transforms normal vectors, where M^-T = (M^T)^-1 = (M^-1)^T.

More coordinate systems: what matrix transforms points from object space to eye space? The modelview matrix VM (gl_ModelViewMatrix).

GLSL built-in matrices. For transforming points: gl_ModelViewMatrix, gl_ProjectionMatrix, gl_ModelViewProjectionMatrix, and their inverses gl_ModelViewMatrixInverse, gl_ProjectionMatrixInverse, gl_ModelViewProjectionMatrixInverse. For transforming normal vectors: gl_ModelViewMatrixInverseTranspose, gl_ProjectionMatrixInverseTranspose, gl_ModelViewProjectionMatrixInverseTranspose; gl_NormalMatrix is the upper 3x3 part of gl_ModelViewMatrixInverseTranspose. The plain transposes (gl_ModelViewMatrixTranspose, gl_ProjectionMatrixTranspose, gl_ModelViewProjectionMatrixTranspose) are also available.

Eye-space lighting. Eye position: (0, 0, 0, 1). Surface point: p_e = VM p_m. View vector: v_e = -p_e/|p_e|. Light position: L_e. Light vector: l_e = (L_e - p_e)/|L_e - p_e|. Surface normal vector: n_e = (VM)^-T n_m.

Object-space (model-space) lighting. Eye position: e_m = (VM)^-1 (0, 0, 0, 1). Surface point: p_m. View vector: v_m = (e_m - p_m)/|e_m - p_m|. Light position: L_m = (VM)^-1 L_e. Light vector: l_m = (L_m - p_m)/|L_m - p_m|.
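Why normals need the inverse transpose rather than M itself can be shown with a 3x3 example (matrices and vectors here are toy choices of ours):

```python
# Under a non-uniform scale, transforming the normal with M breaks
# perpendicularity to the surface; M^-T preserves it.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def apply(A, v):
    return [sum(A[r][c] * v[c] for c in range(3)) for r in range(3)]

M       = [[4.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]   # stretch x by 4
M_inv_T = [[0.25, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # (M^-1)^T (M diagonal)

t = [1.0, 1.0, 0.0]    # tangent to a 45-degree surface
n = [1.0, -1.0, 0.0]   # its normal: n . t == 0

t2 = apply(M, t)                     # transformed tangent: [4, 1, 0]
print(dot(apply(M, n), t2))          # 15.0 -> no longer perpendicular
print(dot(apply(M_inv_T, n), t2))    # 0.0  -> still perpendicular
```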
The surface normal n_m is already given in model coordinates.

Particles on the GPU: particle simulation can run on the vertex processor or on the fragment processor.

On the vertex processor — the most intuitive approach: particles are point primitives, and vertex attributes store the particle state. But there is no write access to memory; glVertex does not change (although a geometry shader can write to memory).

On the fragment processor — render-to-vertex-buffer: store the particle state (position, velocity, size, etc.) in floating-point textures; render to texture by attaching the FP texture to an FBO; then use the texture as vertex attributes.

Ping-ponging buffers. There is no simultaneous read/write access to texture memory: you cannot read from a texture attached to the currently bound FBO, and cannot render to a texture currently bound to GL_TEXTURE_2D. There is no error, but the results of the read are undefined. So use ping-ponging: read from one buffer and write to the other on even iterations, and swap roles on odd iterations. Drawbacks: storage requirements double, and there is the overhead of swapping.

Three approaches to ping-ponging: (1) two FBOs and two textures — only one FBO bound at a time; switch using glBindFramebuffer. (2) One FBO and two textures — only one texture attached to the FBO at a time; switch using glFramebufferTexture. (3) One FBO and two textures attached to different FBO color attachments — render to one attachment at a time; switch using glDrawBuffer. This list is in order of increasing performance on Nvidia hardware.

Updating particle positions. The particle state is stored in the texels of an FP texture, and we want to update each particle exactly once per frame, so we need a 1:1 correspondence between texels and fragments. Draw a view-aligned textured quadrilateral:

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0f, 1.0f, 0.0f, 1.0f, 0.0f, 1.0f);
    glViewport(0, 0, TexSizeU, TexSizeV);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0, 0.0); glVertex2f(0.0, 0.0);
    glTexCoord2f(1.0, 0.0); glVertex2f(1.0, 0.0);
    glTexCoord2f(1.0, 1.0); glVertex2f(1.0, 1.0);
    glTexCoord2f(0.0, 1.0); glVertex2f(0.0, 1.0);
    glEnd();
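The even/odd ping-pong schedule can be sketched in pure Python, with lists standing in for the two floating-point textures (the constant-velocity update is a stand-in for the real velocity shader):

```python
# Ping-pong update: never read and write the same buffer in one frame.
# Even frames read "texture" A and write B; odd frames do the reverse.

def step(old_pos, old_vel, dt):
    new_vel = list(old_vel)                              # UpdateVelocity stub
    new_pos = [p + dt * v for p, v in zip(old_pos, old_vel)]
    return new_pos, new_vel

pos_a, vel_a = [0.0, 1.0, 2.0], [1.0, 1.0, 1.0]   # buffer A (initial state)
pos_b, vel_b = [0.0] * 3, [0.0] * 3               # buffer B (written first)

for frame in range(4):
    if frame % 2 == 0:
        pos_b, vel_b = step(pos_a, vel_a, 0.5)    # read A, write B
    else:
        pos_a, vel_a = step(pos_b, vel_b, 0.5)    # read B, write A

# After 4 frames of dt = 0.5 at unit velocity, positions advanced by 2:
print(pos_a)  # [2.0, 3.0, 4.0]
```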
Particle simulation on the fragment processor — in the fragment shader: read the attributes, update them, and write them to the FBO-attached textures:

    uniform sampler2D Pos2D;
    uniform sampler2D Vel2D;
    uniform float StepSize;
    uniform mat4 ScaleBiasP;
    uniform mat4 ScaleBiasV;

    void main()
    {
        vec4 pos0 = ScaleBiasP * texture2D(Pos2D, gl_TexCoord[0].st);
        vec4 vel0 = ScaleBiasV * texture2D(Vel2D, gl_TexCoord[0].st);
        vec4 vel1 = UpdateVelocity(vel0);
        vec4 pos1 = pos0 + StepSize * vel1;
        gl_FragData[0] = pos1;
        gl_FragData[1] = vel1;
    }

How to render the particles? The variables are in a texture, but we need them as vertex attributes — a render-to-vertex-buffer mechanism. We can't do this directly, but there are at least three ways to achieve it by copying data. Approach one: glReadPixels — a slow GPU-to-CPU copy. Approach two: pixel buffer objects (PBOs) — a fast GPU-to-GPU copy. Approach three: vertex texture fetch — look up the position in the texture during vertex processing.

Approach 1: copy from GPU to CPU, then from CPU back to GPU. Slowest — you don't need to do this anymore:

    glReadBuffer(GL_COLOR_ATTACHMENTn_EXT);
    glReadPixels(0, 0, w, h, format, type, &data);
    glBindBuffer(GL_ARRAY_BUFFER, vboid);
    glBufferData(GL_ARRAY_BUFFER, size, &data, GL_STREAM_DRAW);

Approach 2: pixel buffer objects, a core feature of OpenGL 2.1. When a buffer object is bound to the PIXEL_PACK_BUFFER target, commands such as ReadPixels write their data to the buffer object; when a buffer object is bound to the PIXEL_UNPACK_BUFFER target, commands such as DrawPixels and TexImage2D read their data from the buffer object. A fast GPU-to-GPU copy:

    glReadBuffer(GL_COLOR_ATTACHMENTn_EXT);
    glBindBuffer(GL_PIXEL_PACK_BUFFER_EXT, vboid);
    glReadPixels(0, 0, w, h, format, type, 0);
    glBindBuffer(GL_PIXEL_PACK_BUFFER_EXT, 0);

Approach 3: vertex texture fetch (VTF), available in shader model 3.0 (GeForce 6 series and newer). Setup: create a VBO of 2D points whose locations are the texture coordinates:

    const int w = 256;
    const int h = 256;
    float* data = new float[2*w*h];
    for (int i = 0; i < w; i++)
        for (int j = 0; j < h; j++) {
            data[2*(i*h+j)]   = float(i)/w;
            data[2*(i*h+j)+1] = float(j)/h;
        }
    glBindBuffer(GL_ARRAY_BUFFER, vboid);
    glBufferData(GL_ARRAY_BUFFER, w*h*2*sizeof(float), data, GL_STATIC_DRAW);
    glVertexPointer(2, GL_FLOAT, 0, BUFFER_OFFSET(0));

Rendering with VTF: bind the position texture (and any other textures containing attributes), render the vertex buffer object, and read the attributes from the texture in the vertex shader. In OpenGL:

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, position_tex);
    glUniform1i(position_loc, 0);
    glDrawArrays(...);

In GLSL:

    uniform sampler2D position_tex;
    uniform mat4 ScaleBias;
    void main()
    {
        vec4 pos = ScaleBias * texture2D(position_tex, gl_Vertex.st);
        gl_Position = gl_ModelViewProjectionMatrix * pos;
    }

GPU particles, overview. Init: an FBO, old and new floating-point textures for the particle attributes, and a VBO. Pass 1 (compute positions): render a view-aligned quadrilateral; the fragment program reads each particle position from the old texture and writes the updated position to the new texture. Pass 2 (render particles): the vertex program reads gl_Position from the new texture, and the fragment program renders the particles. Swap the old/new texture IDs and go back to Pass 1.

Water simulation and rendering. Topics: water and fluid simulation; surface waves (sine waves, modified sine waves, Gerstner waves); water rendering (Fresnel's law, Snell's law, the Lambert-Beer law, caustics). For full fluid simulations see Ron Fedkiw et al., physbam.stanford.edu/~fedkiw.

Water animation: the water surface mesh is displaced by a height map — no splashes. A simple model for realtime water, with no FFT. References: M. Finch, "Effective Water Simulation from Physical Models", GPU Gems, 2004; J. Guardado, "Rendering Water Caustics", GPU Gems, 2004 (online at developer.nvidia.com, GPU Gems chapter 1); Hinsinger, D., Neyret, F., and Cani, M.-P., "Interactive animation of ocean waves", Proceedings of the 2002 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 161-166, 2002; Tessendorf, J., "Simulating ocean water", SIGGRAPH Course Notes, 1999.

Waves — a sum of sine waves:

    H(x, y, t) = sum_i A_i sin( w_i (D_i . (x, y)) + phi_i t )

where A_i is the amplitude, w_i the frequency, D_i the direction the crest travels in the x-y plane, and phi_i the speed. The tangent, bitangent, and normal are simple to compute from the partial derivatives. These waves may still be too smooth.
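The sum-of-sines height field and its analytic x-derivative (used for the normal) can be sketched directly; the wave parameters below are arbitrary examples:

```python
# H(x, y, t) = sum_i A_i * sin(w_i * (D_i . (x, y)) + phi_i * t)
# dH/dx      = sum_i A_i * w_i * D_ix * cos(...)  -- feeds the normal.
import math

waves = [
    # (amplitude A, frequency w, unit direction D, speed phi)
    (1.0, 1.0, (1.0, 0.0), 0.5),
    (0.5, 2.0, (0.0, 1.0), 1.3),
]

def height(x, y, t):
    return sum(A * math.sin(w * (Dx * x + Dy * y) + phi * t)
               for (A, w, (Dx, Dy), phi) in waves)

def dheight_dx(x, y, t):
    return sum(A * w * Dx * math.cos(w * (Dx * x + Dy * y) + phi * t)
               for (A, w, (Dx, Dy), phi) in waves)

print(height(0.0, 0.0, 0.0))                 # 0.0: all phases start at zero
print(round(dheight_dx(0.0, 0.0, 0.0), 6))   # 1.0: only wave 1 varies in x
```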
Gerstner waves sharpen the crests by also displacing points horizontally:

    x' = x + sum_i Q_i A_i D_i,x cos( w_i D_i . (x, y) + phi_i t )
    y' = y + sum_i Q_i A_i D_i,y cos( w_i D_i . (x, y) + phi_i t )
    z' =     sum_i A_i sin( w_i D_i . (x, y) + phi_i t )

where Q_i controls the steepness of the waves.

Implementing waves: large-scale waves via vertex displacement in the vertex shader; small-scale waves via normal mapping in the fragment shader. Precompute terms and store them in textures, with filtering and mipmapping for antialiasing.

Water rendering: reflection, refraction, absorption and scattering, caustics.

Reflections — planar approximation. The image of a planar reflection is the image seen by a camera reflected about the plane. For reflections about the x-y plane, the reflection matrix is

    R = [ 1 0  0 0
          0 1  0 0
          0 0 -1 0
          0 0  0 1 ]

Rendering reflections: when rendering from the reflected camera, do not draw objects below the water surface. Use user-defined clip planes: glEnable(GL_CLIP_PLANEi); glClipPlane(GL_CLIP_PLANEi, const GLdouble* equation); and set gl_ClipVertex in GLSL. The plane equation: with clip vertex position (x, y, z, w) and clip plane coefficients (p1, p2, p3, p4), the vertex is clipped if x p1 + y p2 + z p3 + w p4 < 0. For water in the x-y plane:

    double plane[4] = { 0.0, 0.0, 1.0, 0.0 };   // water level at z = 0
    glEnable(GL_CLIP_PLANE0);
    glClipPlane(GL_CLIP_PLANE0, plane);

Reflection mapping: render the reflected image to a texture, then texture the water surface with perturbed projected texture coordinates:

    vec4 reflectionColor = texture2D(reflectionMap,
        ScreenTexCoord.xy + 0.01 * n.xy / ScreenTexCoord.z);

Refraction — Snell's law: rays bend when traveling through interfaces, where n is the index of refraction (n_air = 1.00, n_water = 1.33):

    n1 sin(theta1) = n2 sin(theta2)

To compute the refracted ray in GLSL: vec3 refract(vec3 v, vec3 n, float ratio), with ratio = n1/n2.

Refraction — Fresnel's law governs how much light is reflected vs refracted. Schlick's approximation:

    R = R0 + (1 - R0) (1 - n . v)^5,    R0 = ( (n1 - n2) / (n1 + n2) )^2

Then color = R * reflectedcolor + (1 - R) * refractedcolor.

Real refraction with local objects is difficult: the refracted ray may hit objects occluded from the eye. There is no such problem with an infinite environment cube map. Fake refraction: render the underwater scene to a texture, then render from the texture with perturbed texture coordinates.
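The reflection matrix and the clip-plane test above are both one-liners to verify; a pure-Python sketch:

```python
# Planar reflection about the z = 0 water plane is diag(1, 1, -1, 1);
# a vertex (x,y,z,w) is clipped against plane (p1,p2,p3,p4) when
# x*p1 + y*p2 + z*p3 + w*p4 < 0.

def mat_vec(M, v):
    return [sum(M[r][c] * v[c] for c in range(4)) for r in range(4)]

REFLECT_Z = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, -1, 0], [0, 0, 0, 1]]

eye = [2.0, 3.0, 4.0, 1.0]
print(mat_vec(REFLECT_Z, eye))   # [2.0, 3.0, -4.0, 1.0]: mirrored camera

def clipped(v, plane):
    return sum(a * b for a, b in zip(v, plane)) < 0.0

water = (0.0, 0.0, 1.0, 0.0)     # the z = 0 water plane
print(clipped([0.0, 0.0, -1.0, 1.0], water))  # True: below the surface
print(clipped([0.0, 0.0, 2.0, 1.0], water))   # False: above the surface
```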
Fake refraction:

    vec4 refractionColor = texture2D(refractionMap,
        ScreenTexCoord.xy + 0.01 * n.xy / ScreenTexCoord.z);

Absorption — the Lambert-Beer law. For liquids, intensity decays exponentially with the distance d the light travels, governed by an absorption coefficient a (for gasses: fog). Modelling water absorption: pass a varying depth from the vertex program to the fragment program:

    float fog = 1.0 - pow(10.0, -alpha * depth);
    vec4 finalcolor = mix(color, FogColor, fog);

Fade to a fog color — not necessarily black — with alpha on the order of 0.01 to 0.02.

Caustics: a lensing effect from reflective and transmissive objects. Fake caustics with a light map: shoot a ray from the ground plane up to the water, refract the ray using the water surface normal, and look up a color in the light map using the refracted vector. The light map is a cube map whose color is raised to a power (e.g. color^10 or color^20) to emphasize the sun; alternatively, precompute it and store it in the alpha channel. Other approaches: ray tracing, projected textures.

Foam: apply using texture mapping (E. Darles, B. Crespin, D. Ghazanfarpour, "Accelerating and enhancing rendering of realistic ocean scenes", WSCG 2007; Lasse Staff Jensen and Robert Golias, "Deep Water Animation and Rendering"). Spray: particle systems. Wakes: displacement and normal mapping (Jason L. Mitchell, "Real-Time Synthesis and Rendering of Ocean Water", ATI Research Technical Report).

God rays (crepuscular rays): render to texture, then apply a radial blur.

Mesh processing: surface representations; data structures for triangle meshes; mesh subdivision; mesh simplification.

Surface representations: unstructured point sets; implicit; parametric — piecewise polynomial (splines); piecewise planar (triangle meshes); procedural — constructive solid geometry (CSG), L-systems, shape grammars.

Unstructured point sets: no connectivity; acquired by laser range scanners; useful for very large datasets; rendered by splatting — accumulating and blending ellipses in image space.
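Schlick's approximation and Lambert-Beer attenuation are both small enough to verify numerically; a pure-Python sketch (using e^(-a d) for the decay, one of the standard forms):

```python
# Schlick: R(theta) = R0 + (1 - R0)(1 - n.v)^5, R0 = ((n1 - n2)/(n1 + n2))^2.
# Lambert-Beer: transmitted intensity decays exponentially with distance.
import math

def schlick(cos_theta, n1=1.0, n2=1.33):        # air -> water
    r0 = ((n1 - n2) / (n1 + n2)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

print(round(schlick(1.0), 4))   # ~0.0201: head-on, almost all refracted
print(schlick(0.0))             # 1.0: grazing angle, fully reflected

def beer_lambert(intensity, absorption, distance):
    return intensity * math.exp(-absorption * distance)

print(round(beer_lambert(1.0, 0.1, 10.0), 4))   # ~0.3679 after 10 units
```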
blending ellipses in image space.

Implicit surfaces: represented analytically, e.g. a sphere: x² + y² + z² − r² = 0. In general, F(x, y, z) = const defines an isosurface. Or represent the surface as discretized samples. Useful for morphing and blending surfaces; easy intersection tests.

Implicit surface rendering — two options:
1. Extract a mesh: marching cubes, marching tetrahedra. Define cells of adjacent voxels (8 voxels per cell for marching cubes); determine the surface intersection with each cell by linear interpolation; join the intersection points with triangles (cases shown above).
2. Volume rendering: design a transfer function to visually extract isosurfaces.

Parametric splines: a weighted sum of basis functions. Example: cubic Bézier patches. The cubic Bézier basis functions:

  B0(u) = (1 − u)³
  B1(u) = 3u(1 − u)²
  B2(u) = 3u²(1 − u)
  B3(u) = u³

Most surfaces are too complex to be modelled with a single patch. Patches can be joined smoothly by constraining the derivatives to match at patch boundaries.

Parametric surfaces can be compactly represented in terms of control points (coefficients). "Infinite zoom": you can evaluate at more points to get a smooth surface at any level of detail. Easy to compute derivatives (tangent, bitangent, normal). Difficult intersection tests; complex models need multiple patches or high-order equations.

Triangle meshes can be seen as a sum of piecewise planar basis functions.

Triangle soup: face descriptions only, no shared vertices.
Indexed triangle list: only store unique vertices.
These are fine for rendering but not useful for mesh processing. We want to efficiently navigate a mesh: given a vertex, which faces surround it? Given an edge, which faces are on either side?

Structured meshes: the doubly connected edge list (DCEL) and half-edge data structures. Supplied by external libraries: OpenMesh, the Computational Geometry Algorithms Library (CGAL).

Doubly connected edge list (DCEL) entities:
Faces: not necessarily triangles.
Half-edges: form counterclockwise paths about faces; split each edge into an oppositely directed pair.
Vertices. In a DCEL the half-edge is the most important entity.

For each half-edge, store references to:
1. the vertex at its head;
2. its adjacent face;
3. the next half-edge of the face;
4. its opposite half-edge;
5. (optional) the previous half-edge of the face.
For each face, store a reference to one of its half-edges (any half-edge will do). For each vertex, store a reference to one of its outgoing half-edges. These references may be pointers or array indices.

Now we can easily traverse a mesh.

List the edges around a face (trivial, since the edges form a linked list):

  firstedge = f1.halfedge
  edge = firstedge
  do
      list.append(edge)
      edge = edge.next
  while edge != firstedge

List the vertices around face f1: as above, but append edge.head instead of edge.

List the faces around a face:

  firstedge = f1.halfedge
  edge = firstedge
  do
      if edge.opposite != NULL: list.append(edge.opposite.face)
      edge = edge.next
  while edge != firstedge

List all vertices connected to a given vertex (e.g. v5) by an edge — the one-ring of the vertex:

  firstedge = v5.outgoing
  edge = firstedge
  do
      list.append(edge.head)
      edge = edge.opposite.next
  while edge != firstedge

These operations will be useful when implementing mesh subdivision and simplification algorithms.

Computing properties using the DCEL: the DCEL simplifies the computation of several differential properties.
Vertex normals: traverse the adjacent faces and accumulate the face normals (recall that a face normal can be computed using a cross product); normalize to get a unit vector.
Curvature: for parametric surfaces it is the rate of change of the normal vector; a discrete approximation at a point is computed from its one-ring.

Procedural representations: CSG, L-systems, shape grammars.

CSG: Boolean operations (union, intersection, difference) applied to simple primitives; commonly stored as a tree.

L-systems: generate shapes by string
rewriting. Example: the Sierpiński triangle. A and B are straight lines.

  variables: A, B
  constants: +, −
  start: A
  rules: A → B−A−B, B → A+B+A
  angle: 60°

+ and − are rotations by the angle.

L-systems are useful for organic, plant-like shapes.

Shape grammars: the symbols are shapes; shapes can be parameterized; production rules can include transforming symbols and setting parameter values. Much work involves describing styles of architecture.

Mesh operations: mesh smoothing; mesh reparameterization (for texture coordinate generation and for remeshing).

Mesh smoothing: apply the mesh Laplacian operator, related to Gaussian convolution in image processing:

  p_i ← (1/n) Σ_{j ∈ onering(i)} p_j

Mesh parameterization: find a smooth, invertible mapping from the mesh to the plane. Applications: automatically generate texture coordinates (the problem is to minimize distortion); remeshing.

Mesh simplification: repeatedly apply simple mesh operations which reduce the complexity of the mesh: edge collapse, edge split, edge swap.

Mesh simplification strategies:
Make sure the operation is legal: don't collapse an edge if it results in an inversion — collapsing p to q is illegal if it causes the mesh to fold over itself.
Check that the shape is preserved: the surface shape should be retained; keep sharp corners and creases; don't move too far from the original mesh; collapse short edges first.

Mesh subdivision. Principle: cutting corners. Generate a smoother mesh M_{i+1} from a rough control mesh M_i; apply iteratively to get ever smoother meshes. Sometimes the limit surface, after an infinite number of iterations, has provable smoothness characteristics.

Catmull–Clark subdivision:
1. For each face, add a face point f; f is the centroid of all original points of the face.
2. For each edge, add an edge point e; e is the average of the two neighboring face points and the two original endpoints of the edge.
3.
Connect the face points to the edge points.
4. Move each original point P to P′:

  P′ = (F + 2R + (n − 3)P) / n

where F is the average of the n face points touching P and R is the average of the n edge midpoints for edges touching P. The limit surface for Catmull–Clark subdivision is known to be a cubic B-spline.

Other subdivision schemes: Loop, Butterfly, Catmull–Clark, Doo–Sabin.

Adding creases: subdivision surfaces can be unnaturally smooth. Tag certain edges to be sharp and modify the subdivision rules (crease edge tagged in red). Subdivision is widely used for character modeling.

Textures: types, formats, samplers, access functions, deferred shading.

Types: signed or unsigned byte, int, short int; float; half float (16-bit float, GL_ARB_half_float_pixel). Packed types (version ≥ 1.2): GL_UNSIGNED_SHORT_5_6_5, GL_UNSIGNED_SHORT_5_5_5_1, and many more.

Internal formats: GL_<FORMAT><SIZE>, e.g. GL_ALPHA4. Sizes: 4, 8, 16, 32 (check the spec). Formats: ALPHA, LUMINANCE, LUMINANCE_ALPHA, INTENSITY, RGB, RGBA; DEPTH_COMPONENT (OpenGL ≥ 1.4); COMPRESSED types (LUMINANCE, RGB); sRGB colorspace (OpenGL ≥ 2.1): SRGB, SRGB_ALPHA.

sRGB colorspace: standard RGB, a perceptually uniform colorspace — perceived intensity is proportional to the sRGB values. Related to gamma correction: the monitor doesn't output linear intensities,

  I = C (V + ε)^γ

where I = intensity, C = contrast, V = pixel value, ε = black level (brightness).

Gamma correction: V′ = V^(1/γ). sRGB specifies a different gamma for different intensity ranges:

  V′ = 12.92 V                      for V ≤ 0.0031308
  V′ = 1.055 V^(1/2.4) − 0.055      for V > 0.0031308

As a result, doubling the color component values in an sRGB texture will result in the perceived intensity doubling.

Texture formats: you don't always get what you ask for. [Table: requested OpenGL internal formats vs. the formats actually used by NVIDIA hardware, with the extension that introduced each
(e.g. the FLOAT_R/RG/RGBA formats of NV_float_buffer and the RGBA_FLOAT formats of ATI_texture_float).] See http://download.nvidia.com/developer/OpenGL_Texture_Formats/nv_ogl_texture_formats.pdf

GLSL samplers: sampler1D/2D/3D, samplerCube; sampler1DShadow, sampler2DShadow (depth textures with comparison); sampler2DRect, sampler3DRect, sampler2DRectShadow — bind to GL_TEXTURE_RECTANGLE_ARB and use unnormalized texture coordinates.

Access functions: texture1D/2D/3D, textureCube; texture1DProj/2DProj/3DProj; texture1DLod/2DLod/3DLod/CubeLod — arguments (sampler, coord, lod), available only in the vertex shader. Why? Normally the LOD (mipmap level) is determined at the fragment processing stage. Also texture1DProjLod/2DProjLod/3DProjLod.

Depth textures (glTexParameter):
DEPTH_TEXTURE_MODE — treat depth values as other formats during application: luminance, intensity, alpha.
TEXTURE_COMPARE_MODE — return the boolean result of a comparison instead of depth values: NONE or COMPARE_R_TO_TEXTURE. (Texture coordinates are (s, t, r, q) in OpenGL but (s, t, p, q) in GLSL, since r is red.)
TEXTURE_COMPARE_FUNC — LEQUAL, GEQUAL, LESS, GREATER, EQUAL, NOTEQUAL, ALWAYS, NEVER.

Depth texture access functions, commonly used for shadow mapping with COMPARE_R_TO_TEXTURE:
shadow1D/2D — return 0.0 or 1.0 as the comparison result; R = coord.p.
shadow1DProj/2DProj — return 0.0 or 1.0 as the comparison result; R = coord.p / coord.q.
shadow1DLod/2DLod, shadow1DProjLod/2DProjLod.

Deferred shading. [Deering, Michael, Stephanie Winner, Bic Schediwy, Chris Duffy, Neil Hunt, "The triangle processor and normal vector shader: a VLSI system for high performance graphics", ACM SIGGRAPH Computer Graphics 22(4), 21–30, 1988.] Also see "6800 Leagues Under the Sea", an NVIDIA deferred shading presentation.

Avoid shading until pixel visibility has been determined: render the lighting parameters to textures, then compute fragment colors. Uses a "fat" framebuffer
or geometry buffer ("G-buffer") with multiple render targets: position (or depth), normal, base (diffuse) color, material properties.

Alternatives to deferred shading (the most common use case is scenes with many lights):
Approach 1 — single pass: pass light attributes as uniforms and loop over all light sources in the shader. Problems: pixel shading is expensive; overdraw — you may simply overwrite your expensive pixel with another one; only a few lights significantly affect the appearance of a pixel.
Approach 2 — multipass: loop over all light sources, rendering the scene and adding to the framebuffer. Problems: re-transforming and rasterizing geometry; overdraw.

Deferred shading setup — defining the G-buffer: what spatially varying parameters will you need to do shading? Normals (normal mapping, diffuse and specular lighting); diffuse color; shininess (specular lighting). Create a framebuffer object and attach textures for the G-buffer; create a framebuffer object and attach textures for light accumulation.

G-buffer — one possible layout:

  Diffuse.r  | Diffuse.g  | Diffuse.b  | Ambient occlusion
  Position.x | Position.y | Position.z | Emissive
  Normal.x   | Normal.y   | Normal.z   | Shininess

8 bits per component won't be enough; use 16- or 32-bit float per component.

Deferred shading stages:
1. Geometry phase: render to the G-buffer.
2. Lighting phase: render to the light accumulation buffer.
3. Post-processing phase.

Geometry phase: bind the geometry framebuffer object; draw the geometry; in the fragment shader, write to the G-buffer:

  gl_FragData[0] = vec4(position, 0.0);
  gl_FragData[1] = vec4(bumpNormal * 0.5 + 0.5, base.a);
  gl_FragData[2] = vec4(base.rgb, 0.0);

Unbind the geometry framebuffer object.

Lighting phase: bind the light accumulation framebuffer object; bind the G-buffer textures so we can read from the G-buffer; use additive blending: glBlendFunc(GL_ONE, GL_ONE); draw the light volumes (more later). In the fragment shader, look up the normal, diffuse color, etc. from the textures and render the lit colors. Unbind the light accumulation framebuffer object.

Drawing light volumes: ambient light and directional light fall everywhere — draw a full-screen quadrilateral. For attenuated lights, draw a quadrilateral or point which covers
the range of the light, or render 3D light-volume shapes: ambient term and directional light — full-screen quad; point light — sphere; spotlight — cone. Draw with back-face culling. Why?

The stencil buffer: a logical buffer, usually 1 to 8 bits, which can determine where drawing happens on a pixel-by-pixel basis. Stencil operations happen near the end of the pipeline, after fragment processing and even after alpha testing.

Stencil buffer applications: deferred-shading light volumes; shadow volumes (stencil shadows); portals and mirrors; screen overlays (UI); screen-space transitions (dissolve).

Initialization of the stencil buffer — be sure you have a stencil buffer:

  glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_STENCIL);

You probably want to clear the stencil buffer every frame:

  glClearStencil(0);  // set the value to clear to
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);  // clear the buffers

Enabling/disabling: glEnable/glDisable(GL_STENCIL_TEST).

The stencil comparison function: glStencilFunc(func, ref, mask)
  func — the test function; the same options as the depth test functions: GL_NEVER, GL_LESS, GL_LEQUAL, ..., GL_ALWAYS.
  ref — the reference value for comparisons.
  mask — a bitmask ANDed with the stencil value and the reference value before the comparison.

Writing to the stencil buffer: stencil buffer contents can be modified by drawing geometry. Stencil buffer operations can also differ depending on the depth test — 3 cases: when the stencil test fails; when the stencil test passes and the depth test fails; when the stencil test and depth test both pass.

glStencilOp(sfail, zfail, zpass) — specify the action to take in the 3 cases (stencil test fails; stencil passes and depth fails; stencil passes and depth passes):
  GL_KEEP — do nothing
  GL_ZERO
  GL_REPLACE — replace with ref
  GL_INVERT — bitwise inversion
  GL_INCR — increment the value, clamping to the max value
  GL_DECR — decrement the value, clamping to zero
  GL_INCR_WRAP, GL_DECR_WRAP — wrapping behavior

Usual use:
Pass 1 — draw to the stencil buffer without updating the color or depth buffers, with the stencil test always passing:

  glColorMask(GL_FALSE,
  GL_FALSE, GL_FALSE, GL_FALSE);
  glDepthMask(GL_FALSE);
  glEnable(GL_STENCIL_TEST);
  glStencilFunc(GL_ALWAYS, ...);
  glStencilOp(...);
  // draw objects

Pass 2 — draw to the color buffer where the stencil test passes, without updating the stencil:

  glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
  glDepthMask(GL_TRUE);
  glEnable(GL_STENCIL_TEST);
  glStencilFunc(...);
  glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
  // draw objects

Better light volumes using the stencil buffer: attach a stencil buffer as a renderbuffer object to the FBO.
Draw back faces; on z-fail, increment the stencil:

  glCullFace(GL_FRONT);
  glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
  // draw volume

Draw front faces; on z-fail, decrement the stencil:

  glCullFace(GL_BACK);
  glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
  // draw volume

Draw front faces to the color buffer where the stencil test passes:

  glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
  glStencilFunc(GL_GREATER, 0, 0);

Two-sided stencil: can do front and back faces in a single pass (GL_EXT_stencil_two_side):

  glEnable(GL_STENCIL_TEST);
  glEnable(GL_STENCIL_TEST_TWO_SIDE_EXT);
  glActiveStencilFaceEXT(GL_BACK);
  glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);
  glActiveStencilFaceEXT(GL_FRONT);
  glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
  // draw light volume

Now the light-volume geometry is only drawn once.

Deferred shading optimizations:
Store only z in the G-buffer; compute the eye-space fragment position from the z value.
Store only n_x, n_y in the G-buffer; since n_x² + n_y² + n_z² = 1, you can recompute n_z = sqrt(1 − n_x² − n_y²).

Deferred shading drawbacks: the G-buffer can be huge (our example: 36 MB for a 1024×768 window, plus the light accumulation buffer and other textures). It does not handle transparency (solution: a separate pass for alpha blending). Only a single light shader (solution: render material IDs into the G-buffer and branch in the shader). No advantage for overlapping lights.
Advantages: lighting is only computed for visible fragments.

Using texturing with lighting: factored lighting equations; precomputed lighting (light maps); emission maps; diffuse color maps; specular effects (gloss map, specular map); environment maps; normal mapping (with environment mapping, with anisotropic lighting).

Where to use the texture color values: every term of the lighting equation could possibly be looked up in a texture. Parameters
of the lighting model: roughness, shininess, BRDF. Material colors: diffuse color, specular color. Light colors: the emission term, stored in environment maps. Normals.

BRDF factorization: recall that the BRDF is a 4-dimensional function ρ(v_i, v_o), where v_i is the incoming light direction and v_o is the outgoing light direction. Represent it as a product of 2-dimensional functions, or a sum of products of 2-D functions:

  ρ(v_i, v_o) ≈ Σ_{j=1}^{n} p_j(v_i) q_j(v_o)

Store p and q in small cube maps (32×32). [Chris Wynn, "Real-time BRDF-based lighting using cube maps", NVIDIA whitepaper, 2001.]

Or store basis function coefficients:

  p(v) ≈ Σ_i a_i Y_i(v)

Spherical harmonics: the Y_i are basis functions defined over the sphere. This model can be fit to measured data by using least squares to solve for the coefficients a_i. The SH basis functions are stored in a 2D texture using a paraboloid parameterization of the hemisphere (red values are positive, blue are negative). [Kautz, Sloan, Snyder, "Fast, Arbitrary BRDF shading for low-frequency lighting using spherical harmonics", Eurographics 2002.]

Factorized Cook–Torrance model:
D — the distribution of microfacets; the Beckmann distribution depends on n·h and m. Precompute the value of D for all (n·h, m) pairs and store it in a texture.
F — the Fresnel term; depends on n·v and the index of refraction η.

Reparameterization of BRDFs: a change of variables may make the BRDF easier to compress:

  ρ(v_i, v_o) → ρ(h, d)

where h is the halfway vector and d the difference vector. The BRDF may be more separable in these coordinates:

  ρ(v_i, v_o) ≈ Σ_{j=1}^{n} p′_j(h) q′_j(d)

[Simon Rusinkiewicz, "A new change of variables for efficient BRDF representation", Eurographics 1998.]

Light maps:

  I_v = k_d · lightmap(x, y, z) + Σ_{i=1}^{n} L_i k_s ρ_s(l_i, v)

Ambient and diffuse light intensity can vary over the scene. Static lights: precompute the lighting offline. The specular term can't be precomputed, due to view dependence. Ambient and diffuse lighting can be stored in low-resolution textures or in vertex colors.

Emission maps:

  I_v = k_e + k_a L_a + Σ_{i=1}^{n} L_i (k_d ρ_d(l_i, v) + k_s ρ_s(l_i, v))

k_e = emission ("glow map"); models light sources on a surface; does not depend on the surface normal or the light sources. The glow map represents
self-illuminated areas.

Diffuse color in a texture:

  I_v = k_a L_a + Σ_{i=1}^{n} L_i (k_d ρ_d(l_i, v) + k_s ρ_s(l_i, v))
  vec4 kd = texture2D(diffuseTex, texcoord.st);

Specular effects using textures: specular color and shininess.
Specular color — gloss map; can be gray scale (specular intensity).
Shininess — controls the specular exponent in the Phong light model, or the roughness in other light models. (Specular map with a specular exponent map: specular color + specular exponent + diffuse color; shiny vs. dull.)

Environment maps: the specular light color. A specially designed cube map can also be used for the diffuse color. Cube mapping simply selects the color in the reflection direction; this simulates a very shiny surface (a high specular exponent in the Phong model).

Use texture LOD bias (mipmap level bias) to simulate shininess/roughness:

  vec4 specularTerm = textureCube(envCube, r.xyz, bias) * specularColor;
  // cube map sampler, reflection vector, roughness

Cube mapping in GLSL — a separate sampler type: uniform samplerCube cubemap. The texture lookup uses the reflection vector rather than texture coordinates:

  vec3 r = reflect(v, n);
  vec4 color = textureCube(cubemap, r, bias);

GLSL does the cube-map face lookup and converts r to texture coordinates internally. The optional bias parameter is added to the mipmap level (a bias of 1.0 means one mipmap level higher).

Cubemap mipmapping: recall that the mipmap levels are blurred and subsampled versions of the base texture. This blurring allows neighboring pixels to influence the reflected color (shiny vs. rough). It is equivalent to spreading out the BRDF around the reflection direction. There is a well-formulated theory for doing this in a physically accurate way.

Reflection as convolution: reflected light is the incident illumination convolved with the BRDF:

  I(θ_o) = ∫_Ω L(θ_i) ρ(θ_i, θ_o) cos(θ_i) dθ_i

[R. Ramamoorthi, P. Hanrahan, "A signal-processing framework for reflection", ACM TOG, August 2002.]

Environment maps can be combined with the diffuse material color.
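The "reflection as convolution" idea can be sketched in one dimension: blurring the environment (here a 1-D array of directional radiance samples) with a normalized cosine-power lobe is the prefiltering that a blurred cube-map mip level approximates. This is an illustrative toy, not any graphics API; all function names are made up for the sketch.

```python
import math

def phong_lobe(center, angles, exponent):
    """Normalized cos^n lobe around `center`; a higher exponent = shinier."""
    w = [max(0.0, math.cos(a - center)) ** exponent for a in angles]
    s = sum(w)
    return [x / s for x in w]

def prefilter(env, angles, exponent):
    """Convolve directional radiance with the lobe (one output per direction)."""
    out = []
    for c in angles:
        lobe = phong_lobe(c, angles, exponent)
        out.append(sum(l * e for l, e in zip(lobe, env)))
    return out

# Environment: a single bright "sun" sample among dark samples.
n = 64
angles = [math.pi * (i / (n - 1) - 0.5) for i in range(n)]  # -90..90 degrees
env = [0.0] * n
env[n // 2] = 100.0

shiny = prefilter(env, angles, exponent=200.0)  # tight, bright highlight
rough = prefilter(env, angles, exponent=2.0)    # wide, dim highlight
print(max(shiny), max(rough))
```

The rough prefiltering spreads the sun over many output directions with a lower peak, which is exactly what sampling a higher (blurrier) mip level of an environment cube map does for a rough surface.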
A cube map gives the specular light color for very shiny objects; a blurred cube map gives the specular contribution for less shiny surfaces.

Correct cubemap filtering: simply generating independent mipmaps for each face results in seams. The highest mipmap level is a constant color, so you can get 6 differently colored cube faces at the highest level. Correct cubemap mipmapping blurs across face boundaries and then subsamples — treat the cubemap as an image over a sphere. ATI's CubeMapGen utility will generate better mipmaps for cubemaps.

Normal maps:

  I_v = k_a L_a + Σ_{i=1}^{n} L_i (k_d ρ_d(l_i, v) + k_s ρ_s(l_i, v))

To extract a unit normal vector from a normal map texture:

  vec3 normal = 2.0 * texture2D(normalTex, texcoord.st).xyz - 1.0;

(Diffuse texture + normal map.)

Normal maps give the illusion of geometric detail: shape perception depends on lighting cues, so use the perturbed per-pixel normal for lighting. (Without normal mapping vs. with normal mapping.)

Combine with mesh simplification: 249,924 triangles → 62,480 triangles → 7,809 triangles → 975 triangles. [Cohen, Olano, Manocha, "Appearance-preserving simplification", SIGGRAPH 1998.] ATI applications: GPU MeshMapper, NormalMapper.

What coordinate system are the normals in?
World coordinates: not very flexible — if the object rotates you need to recompute.
Local coordinates: associated with a specific model.
Tangent space: most flexible.

Think of the t, n, b vectors as a coordinate frame. We usually have t, n, b expressed in object/model coordinates (as in the parametric surface example). We can then write a tangent-space vector q = (q0, q1, q2) in model coordinates as

  q = q0 t + q1 b + q2 n

TBN matrix: can we write this as a matrix–vector multiplication?

  [ t0 b0 n0 ] [ q0 ]   [ q0 t0 + q1 b0 + q2 n0 ]
  [ t1 b1 n1 ] [ q1 ] = [ q0 t1 + q1 b1 + q2 n1 ]
  [ t2 b2 n2 ] [ q2 ]   [ q0 t2 + q1 b2 + q2 n2 ]

More coordinate systems: tangent space —(TBN matrix L_TBN)→ object/model space —(model matrix M)→ world space —(view matrix V)→ eye space —(projection matrix P)→ clip space. The TBN matrix

  L_TBN = [ t0 b0 n0 ; t1 b1 n1 ; t2 b2 n2 ]
takes tangent-space vertex coordinates into object space.

If M transforms points from A to B, then M⁻¹ transforms points from B to A. If M transforms points from A to B and N transforms points from B to C, then NM transforms points from A to C. If M transforms points, then M⁻ᵀ = (M⁻¹)ᵀ transforms vectors (the matrix inverse transpose).

What matrix transforms vectors from eye space to object space? (VM)ᵀ — gl_ModelViewMatrixTranspose.

GLSL built-in matrices for transforming points:
  gl_ModelViewMatrix, gl_ProjectionMatrix, gl_ModelViewProjectionMatrix
  gl_ModelViewMatrixInverse, gl_ProjectionMatrixInverse, gl_ModelViewProjectionMatrixInverse
GLSL built-in matrices for transforming vectors:
  gl_ModelViewMatrixInverseTranspose, gl_ProjectionMatrixInverseTranspose, gl_ModelViewProjectionMatrixInverseTranspose
  gl_ModelViewMatrixTranspose, gl_ProjectionMatrixTranspose, gl_ModelViewProjectionMatrixTranspose
Also, gl_NormalMatrix is the upper 3×3 part of gl_ModelViewMatrixInverseTranspose.

Inverse of the TBN matrix: if t, b, n are mutually orthogonal then

  L_TBN⁻¹ = L_TBNᵀ

and we can often use this as an approximation. In general we only have t ⊥ n and b ⊥ n, but not necessarily t ⊥ b, so we can't use the transpose.

3×3 matrix inverse: consider the column vectors of A = [a | b | c]. Then

  A⁻¹ = (1/|A|) [ (b×c)ᵀ ; (c×a)ᵀ ; (a×b)ᵀ ]

(the rows of the inverse are the cross products of the columns, divided by the determinant).

TBN matrix inverse: for the column vectors of L_TBN = [t | b | n],

  L_TBN⁻¹ = (1/|L_TBN|) [ (b×n)ᵀ ; (n×t)ᵀ ; (t×b)ᵀ ]

Now use the fact that n = t×b and the vector triple product

  a×(b×c) = b(a·c) − c(a·b)

to expand:

  b×n = b×(t×b) = t(b·b) − b(b·t)
  n×t = (t×b)×t = b(t·t) − t(t·b)
  |L_TBN| = (t×b)·n = (b·b)(t·t) − (b·t)²

so

  L_TBN⁻¹ = (1/((b·b)(t·t) − (b·t)²)) [ ((b·b)t − (b·t)b)ᵀ ; ((t·t)b − (t·b)t)ᵀ ; nᵀ ]

Creating tangent-space normal maps.
Start with a bump map (intensity proportional to height). Use Photoshop or GIMP plugins (NVIDIA), or a standalone application: ATI NormalMapGenerator, CrazyBump (www.crazybump.com), xNormal (www.xnormal.net). Or work from photographs, or compute the map yourself.

Normal map photography (www.zarria.net/nrmphoto/nrmphoto.html): take four photographs under different lighting conditions; combine pairs into red/green color channels; overlay the red/green images and add a blue channel.

From bump map to normal map: think of the parametric surface defined by the bump map,

  p(u, v) = ( x(u, v), y(u, v), z(u, v) ) = ( u, v, I(u, v) )

where z is proportional to the image intensity. What is the normal vector at each vertex? First find the tangent and bitangent:

  T = ∂p/∂u = (1, 0, ∂I/∂u)
  B = ∂p/∂v = (0, 1, ∂I/∂v)

From the tangent and bitangent, find the normal:

  N = (T×B)/‖T×B‖ = (−∂I/∂u, −∂I/∂v, 1) / sqrt(1 + (∂I/∂u)² + (∂I/∂v)²)

The image derivatives can be approximated using finite differences. There are many possible schemes; forward differencing gives

  ∂I/∂u ≈ I(u+1, v) − I(u, v)
  ∂I/∂v ≈ I(u, v+1) − I(u, v)

Finally, map the vector to a color with b bits per color channel (each component mapped from [−1, 1] to [0, 2^b − 1]).

Lighting with the normal map — example: lighting in eye space. Transform each normal, per pixel, into eye space.

Vertex shader:

  attribute vec3 rm_Tangent;
  attribute vec3 rm_Bitangent;
  varying vec3 normal, tangent, bitangent, position;

  void main(void)
  {
      gl_Position = ftransform();
      gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
      // eye-space tangent, normal and bitangent vectors
      normal = gl_NormalMatrix * gl_Normal.xyz;
      tangent = gl_NormalMatrix * rm_Tangent.xyz;
      bitangent = gl_NormalMatrix * rm_Bitangent.xyz;
      position = (gl_ModelViewMatrix * gl_Vertex).xyz;
  }

Fragment shader:

  uniform vec4 ambientColor, diffuseColor, specularColor;
  uniform sampler2D diffuseTex, normalTex;
  uniform vec4 L;
  uniform float shininess;
  varying vec3 normal, tangent, bitangent, position;

  void main(void)
  {
      vec3 l = normalize(L.xyz);
      vec3 T = normalize(tangent);
      vec3 B = normalize(bitangent);
      vec3 N = normalize(normal);
      // tangent-space normal from the normal map
      vec3 tn = 2.0 * texture2D(normalTex, gl_TexCoord[0].st).xyz - 1.0;
      // eye-space perturbed normal
      vec3 n = normalize(T*tn.x + B*tn.y + N*tn.z);
      vec4 diffuseTerm = texture2D(diffuseTex, gl_TexCoord[0].st) *
                         diffuseColor * max(0.0, dot(n, l));
      // eye-space reflection and view vectors
      vec3 r = reflect(-l, n);
      vec3 v = normalize(-position);
      vec4 specularTerm = specularColor * pow(max(0.0, dot(r, v)), shininess);
      gl_FragColor = ambientColor + diffuseTerm + specularTerm;
  }

Lighting with the normal map — example: lighting in tangent space. Transform the light and view vectors into tangent space, per vertex.

Vertex shader:

  uniform vec4 L;                  // directional light in eye space
  attribute vec3 rm_Tangent;
  attribute vec3 rm_Bitangent;
  varying vec3 v, l;

  void main(void)
  {
      gl_Position = ftransform();
      gl_TexCoord[0] = gl_TextureMatrix[0] * gl_MultiTexCoord0;
      vec4 camera = gl_ModelViewMatrixInverse * vec4(0.0, 0.0, 0.0, 1.0);
      // object-space view and light vectors
      vec3 view = normalize(camera.xyz - gl_Vertex.xyz);
      vec3 light = normalize((gl_ModelViewMatrixTranspose * L).xyz);
      // transform vectors from object space to tangent space;
      // for an orthonormal frame the inverse of the TBN matrix is its transpose
      mat3 TBN = mat3(rm_Tangent, rm_Bitangent, gl_Normal);
      l = light * TBN;   // = transpose(TBN) * light
      v = view * TBN;
  }

Fragment shader:

  uniform vec4 ambientColor, diffuseColor, specularColor;
  uniform sampler2D diffuseTex, normalTex;
  uniform float shininess;
  varying vec3 v, l;

  void main(void)
  {
      vec3 ln = normalize(l);
      vec3 vn = normalize(v);
      // tangent-space normal
      vec3 n = 2.0 * texture2D(normalTex, gl_TexCoord[0].st).xyz - 1.0;
      vec4 diffuseTerm = texture2D(diffuseTex, gl_TexCoord[0].st) *
                         diffuseColor * max(0.0, dot(n, ln));
      // tangent-space reflection vector
      vec3 r = reflect(-ln, n);
      vec4 specularTerm = specularColor * pow(max(0.0, dot(r, vn)), shininess);
      gl_FragColor = ambientColor + diffuseTerm + specularTerm;
  }

Computing TBN for meshes: for a parametric surface we simply compute derivatives. For a texture-mapped mesh, the texture mapping M : R³ → R², M(x, y, z) → (u, v), maps the surface to the texture image. If M is invertible then M⁻¹ defines a parameterization of the mesh: M⁻¹ : R² → R³, M⁻¹(u, v) → (x, y, z).
The texture mapping is given explicitly at the vertices and linearly interpolated over triangles:

  x = a u + b v + c
  y = d u + e v + f        i.e.  p = T u + B v + k
  z = g u + h v + i

with T = (a, d, g), B = (b, e, h), k = (c, f, i). At the three vertices of a triangle:

  p1 = T u1 + B v1 + k
  p2 = T u2 + B v2 + k
  p3 = T u3 + B v3 + k

Take differences to eliminate the constant vector k:

  p2 − p1 = T (u2 − u1) + B (v2 − v1)
  p3 − p1 = T (u3 − u1) + B (v3 − v1)

Rewrite as a 2×2 linear system:

  [ p2 − p1 ]   [ u2 − u1   v2 − v1 ] [ T ]
  [ p3 − p1 ] = [ u3 − u1   v3 − v1 ] [ B ]

  [ T ]   [ u2 − u1   v2 − v1 ]⁻¹ [ p2 − p1 ]
  [ B ] = [ u3 − u1   v3 − v1 ]   [ p3 − p1 ]

Using the 2×2 inverse

  [ a11 a12 ]⁻¹ = (1/(a11 a22 − a21 a12)) [  a22  −a12 ]
  [ a21 a22 ]                             [ −a21   a11 ]

gives

  T = ( (v3 − v1)(p2 − p1) + (v1 − v2)(p3 − p1) ) / D
  B = ( (u1 − u3)(p2 − p1) + (u2 − u1)(p3 − p1) ) / D
  D = (u2 − u1)(v3 − v1) − (u3 − u1)(v2 − v1)

Environment-mapped bump mapping (EMBM): use the perturbed normal in environment mapping.

Normal mapping with anisotropic lighting: recall that anisotropic lighting equations require the tangent vector as well. Construct a perturbed tangent vector t′ that is perpendicular to the perturbed normal n′ by subtracting out the perturbed-normal component and renormalizing:

  t′ = (t − (t·n′)n′) / ‖t − (t·n′)n′‖

Ambient occlusion map: precompute how much ambient light strikes each fragment. It can also be computed per vertex and stored as an attribute. In real-time rendering this is estimated locally; in offline rendering (e.g. ray tracing) it can be done globally. (Unoccluded vs. partially occluded.) Also related: horizon mapping.

Values in the map are occlusion factors: 1 = completely occluded, 0 = unoccluded. The factor modulates the ambient term:

  I_ambient = (1 − occlusion) k_a L_a

It can also modulate the diffuse term:

  I_diffuse = (1 − occlusion) k_d L_d max(n·l, 0)

Bent normal:
the bent normal is the average unoccluded direction:

  I_diffuse = (1 − occlusion) k_d L_d max(n_bent·l, 0)

Results: diffuse light vs. diffuse light with ambient occlusion.

Precomputing ambient occlusion — two approaches.
Inside looking out: shoot rays from each vertex/fragment, distributed randomly over the hemisphere; intersection-test each ray against the mesh; the fraction of rays which intersect the mesh is the occlusion factor.
Outside looking in: camera positions are distributed randomly over the hemisphere.

  for each camera position 1..m:
      render mesh to depth buffer
      for each vertex v:
          begin occlusion query
          render vertex as point
          end occlusion query
          visible(v) += query result
  occlusion(v) = 1 − visible(v)/m

Applications: faogen (fast ambient occlusion generator), Blender.

Irradiance map: for ambient lighting; sometimes called a diffuse environment map. It is an integral over the hemisphere of the cube map. Look up the blurred cube-map color using the normal vector, or the bent normal:

  float blur = 1.0 - occlusion;
  vec4 color = kd * (1.0 - occlusion) * textureCube(envMap, n_bent, blur);

(Incident illumination → irradiance map.)

Ambient occlusion issues:
Bump mapping + ambient occlusion: the bumps need to be taken into consideration when computing the ambient occlusion map.
Animated objects need dynamic ambient occlusion. [M. Bunnell, "Dynamic ambient occlusion and indirect lighting", GPU Gems 2, pp. 223–233, 2005.]
Inter-object ambient occlusion, pseudo-global illumination. [J. Kontkanen, S. Laine, "Ambient Occlusion Fields", Proceedings of I3D, pp. 41–48, 2005.]

Screen-space ambient occlusion (SSAO): render the z-buffer to a texture; compare each pixel's depth (and normal) with its neighbors to estimate local ambient occlusion — in effect, finding creases in the depth buffer. [M. Mittring (Crytek), "Finding Next Gen: CryEngine 2", Advanced Real-Time Rendering in 3D Graphics and Games course, SIGGRAPH 2007.] (With SSAO vs. without SSAO.)

Related techniques:
Horizon mapping: record the angle to the horizon at a discrete set of directions.
Ambient aperture lighting: ambient
occlusion for a single dynamic area light. The light and the unoccluded area are spherical caps; find the intersection of the spherical caps.

Texturing and sampling: mipmapping; automatic mipmap generation; generating mipmaps on the GPU; framebuffer-attached textures; anisotropic texture filtering; non-power-of-two textures; multisample antialiasing.

Image representation: a continuous signal represented as discrete samples. Nyquist sampling rate: at least 2 times the highest frequency in the image. (Continuous signal and undersampled reconstruction.)

Texture mapping involves 2 sampling issues:
The original texture image: sampling some image/signal into a 2D texture array.
The image of the textured surface in the framebuffer: the rasterization process samples the textured surface into the framebuffer, which can be seen as a resampling process. If the object is small, the framebuffer may undersample the texture image (minification filter). If the object is large, the image may be oversampled (magnification filter).

Mipmapping: "mip" = multum in parvo, "many in a small place". A representation of an image at multiple scales: from a high-spatial-resolution image with high frequencies down to low-spatial-resolution images with only low frequencies. To generate mipmaps: start from the highest resolution image (level 0); smooth (low-pass filter); subsample (take every other pixel).

Texture filtering (required texture reads):
Magnification:
  GL_NEAREST — nearest texel center (1)
  GL_LINEAR — weighted average of the surrounding 4 texels (4)
Minification:
  GL_NEAREST, GL_LINEAR
  GL_NEAREST_MIPMAP_NEAREST (1) — choose the mipmap whose texel size is nearest the pixel's size; nearest-neighbor filtering within that mipmap
  GL_LINEAR_MIPMAP_NEAREST (bilinear, 4) — linear filtering within the nearest mipmap
  GL_NEAREST_MIPMAP_LINEAR (2) — nearest filtering within two mipmap levels, then average
  GL_LINEAR_MIPMAP_LINEAR (trilinear, 8) — linear filtering within two mipmap levels, then average
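The smooth-then-subsample recipe above can be sketched directly. This toy builds a full mipmap chain for a 1-D grayscale "image" with a 2-tap box filter; real GPU mipmap generation works on 2-D images with 2×2 boxes (and glGenerateMipmap does it for you), but the level count and averaging behavior are the same idea.

```python
def next_level(img):
    """Low-pass (average adjacent pairs) and subsample: one mip level down."""
    return [(img[2*i] + img[2*i + 1]) / 2.0 for i in range(len(img) // 2)]

def build_mipmaps(level0):
    """Full chain for a power-of-two-length image, down to a single texel."""
    chain = [level0]
    while len(chain[-1]) > 1:
        chain.append(next_level(chain[-1]))
    return chain

chain = build_mipmaps([0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0])
for level, img in enumerate(chain):
    print(level, img)
# an 8-texel level 0 yields 4 levels; the last level is the single average texel
```

Note how each level halves the resolution, so a base image of width w has 1 + log2(w) levels, and the top level is the mean of the whole image, matching the "constant color" highest mip level discussed for cubemaps.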