Posts

Showing posts from August, 2021

sin vertex displacement shader

The following vertex-surface shader moves vertices up and down based on some sine waves and the time variable. Combining lots of waves is a good way to generate interesting displacements. Thing to note: we calculate a new normal in the vert function to prevent weird shading issues.

Shader "Dave/Unlit/waves" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _Tint ("Colour Tint", color) = (1,1,1,1)
        _Freq ("Frequency", Range(0,5)) = 3
        _Speed ("Speed", Range(0,100)) = 10
        _Amp ("Amplitude", Range(0,1)) = 0.5
    }
    SubShader {
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert
        struct Input {
            float2 uv_MainTex;
            float3 vertColor;
        };
        float4 _Tint;
        float _Freq;
        float _Speed;
        float _Amp;
        struct appdata {
            float4 vertex : POSITION;
            float3 normal : NORMAL;
            float4 texcoord : TEX
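The preview above is cut off. As a hedged sketch (not the post's actual code), a vert function matching the description - a sine displacement plus a recomputed normal - might look like this; the wave axis and exact maths below are assumptions:

```hlsl
// Sketch only - assumes the wave runs along x and uses _Time.y (seconds).
void vert(inout appdata v) {
    float phase = v.vertex.x * _Freq + _Time.y * _Speed;
    v.vertex.y += sin(phase) * _Amp;
    // Recompute the normal from the wave's slope (cos is the derivative of sin)
    // to avoid the shading issues mentioned above.
    float slope = cos(phase) * _Freq * _Amp;
    v.normal = normalize(float3(-slope, 1, 0));
}
```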

slightly better outlines

This approach to outlines is a bit more robust and adds outlines to areas within the shape's silhouette - e.g. details around the nose and eyes, rather than just the outline of the head. It utilises passes: first drawing the shape as normal, with a standard Input struct and void surf, then adding outlines on top using a vert-frag shader. It's quite cool to see surface and vert-frag shaders being mixed. Thing to note: we Cull Front... we're essentially rendering the inside of the mesh as the outlines. I've found an explanation online about this shader, pasted below the code. From the docs, UNITY_MATRIX_IT_MV is the inverse transpose of the model * view matrix. I'm not entirely sure what is happening with this code, but I'm fairly sure we're taking the vertex normal and multiplying it by that matrix, before getting it into screen space.

Shader "Dave/Unlit/advancedOutline" {
    Properties {
        _MainTex ("Texture", 2D) = "white
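Since the code is cut off, here's a hedged sketch of the kind of outline pass being described, based on the widely used inverted-hull toon outline; property names like _Outline are assumptions:

```hlsl
// Sketch of an inverted-hull outline pass - not necessarily the post's exact code.
// The pass uses Cull Front so only the back faces of the inflated shell render.
float _Outline;        // assumed outline-width property
float4 _OutlineColor;  // assumed outline-colour property

struct v2f { float4 pos : SV_POSITION; };

v2f vert(appdata_base v) {
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    // Rotate the normal into view space via the inverse-transpose model*view matrix...
    float3 n = normalize(mul((float3x3)UNITY_MATRIX_IT_MV, v.normal));
    // ...then into projection space, and push the vertex outwards on screen.
    float2 offset = TransformViewToProjection(n.xy);
    o.pos.xy += offset * o.pos.z * _Outline;
    return o;
}

fixed4 frag(v2f i) : SV_Target {
    return _OutlineColor;
}
```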

simple outline shader, using the vertex normal

Beneath is a method of drawing outlines around our mesh - or rather, we're expanding the mesh, giving it a flat colour, and then drawing the mesh as normal on top of it. Note that we have two CGPROGRAM blocks, which get executed in order. This type of outline requires the Transparent queue tag to work properly, and this could cause problems elsewhere. Also, it only outlines the extremities of a shape - the silhouette edges. The next post will feature a more advanced shader.

Shader "Dave/Unlit/simpleOutline" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _OutlineColor ("Outline Colour", color) = (0,0,0,1)
        _Outline ("Outline Width", Range(-0.1,0.1)) = 0.005
    }
    SubShader {
        Tags { "Queue" = "Transparent" }
        ZWrite off
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert
        struct Input {
            float2 uv_MainTex;
        };
        float _O
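The preview cuts off inside the first block. A hedged sketch of how that outline block typically continues (an assumption, not the post's exact code):

```hlsl
// Sketch: the first CGPROGRAM block inflates the mesh and paints a flat colour.
float _Outline;
float4 _OutlineColor;

void vert(inout appdata_full v) {
    v.vertex.xyz += v.normal * _Outline;   // push vertices out along their normals
}

void surf(Input IN, inout SurfaceOutput o) {
    o.Emission = _OutlineColor.rgb;        // flat outline colour, ignores lighting
}
// The second CGPROGRAM block then draws the mesh normally on top of this shell.
```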

shader scroll UVs

This nice, short and sweet shader scrolls the texture using time and some float values, in the x/y (or u/v) directions. All we're doing is manipulating the u and v values. Unity provides us with the time variable "_Time". According to the docs, the time is in seconds and is scaled by the game's time multiplier. Beneath the shader are some docs on time variables and info.

Shader "Dave/Unlit/UVSCROLL" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _ScrollX ("Scroll X", Range(-5,5)) = 1
        _ScrollY ("Scroll Y", Range(-5,5)) = 1
    }
    SubShader {
        CGPROGRAM
        #pragma surface surf Lambert
        sampler2D _MainTex;
        float _ScrollX;
        float _ScrollY;
        struct Input {
            float2 uv_MainTex;
        };
        void surf(Input IN, inout SurfaceOutput o) {
            // NB: _Time is a float4 (t/20, t, t*2, t*3), so these implicit
            // truncations multiply by _Time.x, i.e. t/20.
            _ScrollX *= _Time;
            _ScrollY *= _Time;
            float2 newuv = IN.uv_MainTex
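As the preview is truncated, here's a hedged sketch of how such a surf function could finish - an assumption, not the post's exact code:

```hlsl
// Sketch: offset the UVs by a time-scaled scroll amount, then sample.
void surf(Input IN, inout SurfaceOutput o) {
    float2 scroll = float2(_ScrollX, _ScrollY) * _Time.y;  // _Time.y = seconds
    float2 newuv = IN.uv_MainTex + scroll;                 // repeat-wrap handles overflow
    o.Albedo = tex2D(_MainTex, newuv).rgb;
}
```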

using vertex shaders with surface shaders - "extrusion"/displace by normal?

Here's a nice and simple vert-surface shader that affects the appearance of the object's vertices. If you click the object in the editor view, you'll see the object outline remains normal - only the rendered appearance changes. The magic happens in the vert function, where you add the normal of the vertex, multiplied by the extrude amount, to the vertex's original position.

Shader "Dave/Unlit/Extrude" {
    Properties {
        _MainTex ("MainTex", 2D) = "white" {}
        _Amount ("extrude", Range(-1,1)) = 0.01
    }
    SubShader {
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert
        struct Input {
            float2 uv_MainTex;
        };
        struct appdata {
            float4 vertex : POSITION;
            float3 normal : NORMAL;
            float4 texcoord : TEXCOORD0;
        };
        float _Amount;
        void vert(inout appdata v) {
            v.vertex.xyz += v.normal * _Amount;
        }

Fragment/Vertex shader Casting and Receiving shadows

There are two parts to this - let's focus on the first Pass block. The first block allows the shader to receive shadows; the previous ones were all flat. The LightMode tag marks the pass as ForwardBase, so it is rendered as the main forward-lighting pass. We have some extra #pragma stuff that I don't really understand - apparently it's there so we can have complete control over the lighting. Some extra #includes are present as well, with more functions to do with lighting. In our appdata struct we're using the NORMAL now. In the v2f struct we call the "position" variable "pos", because later on the TRANSFER_SHADOW macro requires that specific name within the struct - it won't recognise "vertex". There's also a SHADOW_COORDS variable, to tell the shadow where to go. In our vert function we convert the vertex to screenspace, calculate its worldspace normal, then get the dot product between the normal & the light's positi
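The post's code is cut off; the pass being described is very close to the shadow-receiving example in Unity's vertex/fragment shader documentation, which (as a sketch) looks like:

```hlsl
// Sketch of a shadow-receiving ForwardBase pass, after Unity's docs example.
// Requires: Tags { "LightMode" = "ForwardBase" }, #pragma multi_compile_fwdbase,
// and #includes of "UnityCG.cginc", "Lighting.cginc" and "AutoLight.cginc".
struct v2f {
    float4 pos : SV_POSITION;  // must be named "pos" for TRANSFER_SHADOW
    SHADOW_COORDS(1)           // shadow-map coordinates in TEXCOORD1
    fixed3 diff : COLOR0;
};

v2f vert(appdata_base v) {
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    half3 worldNormal = UnityObjectToWorldNormal(v.normal);
    half nl = max(0, dot(worldNormal, _WorldSpaceLightPos0.xyz));
    o.diff = nl * _LightColor0.rgb;
    TRANSFER_SHADOW(o)
    return o;
}

fixed4 frag(v2f i) : SV_Target {
    fixed shadow = SHADOW_ATTENUATION(i);  // 0 in shadow, 1 fully lit
    return fixed4(i.diff * shadow, 1);
}
```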

fragment/vertex shader - distorting UVs and the Grabpass

Two things are shown here, with a bit of fiddling needed to make it actually work - which shouldn't be a problem if you've looked at all the other bits. In the appdata struct we're including the UVs now, with the TEXCOORD0 attribute. In the vert function, we use TRANSFORM_TEX(v.uv, _MainTex), which applies the texture's tiling and offset to our uv coordinates. Once we have these values, we use the sin function to distort both the X & Y (or U & V!), multiplying the input by some properties we defined earlier. This is passed to the frag shader, which displays "a" texture. What I've not mentioned is the GrabPass - it essentially takes a snapshot of the framebuffer and stores it as a texture. The Transparent queue tag stops it from going crazy and rendering into infinity (think of when you record a TV and feed the recording into the TV). The grabbed texture is declared as a sampler2D, which for an unnamed GrabPass must be called _GrabTexture - after that it can be used in a tex2D function as usual.

Shader "Dave/U
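A hedged sketch of the GrabPass structure being described - the distortion maths is an assumption, not the post's exact code:

```shaderlab
// Sketch of a distorting GrabPass shader - not the post's exact code.
SubShader {
    Tags { "Queue" = "Transparent" }
    GrabPass { }   // unnamed: the screen snapshot becomes _GrabTexture
    Pass {
        CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag
        #include "UnityCG.cginc"
        sampler2D _GrabTexture;
        struct v2f {
            float4 pos : SV_POSITION;
            float4 grabPos : TEXCOORD0;
        };
        v2f vert(appdata_base v) {
            v2f o;
            o.pos = UnityObjectToClipPos(v.vertex);
            o.grabPos = ComputeGrabScreenPos(o.pos);  // UVs into the grabbed texture
            return o;
        }
        half4 frag(v2f i) : SV_Target {
            // assumed distortion: wiggle U with a sine of V and time
            i.grabPos.x += sin(i.grabPos.y * 20 + _Time.y) * 0.02;
            return tex2Dproj(_GrabTexture, i.grabPos);
        }
        ENDCG
    }
}
```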

fragment vertex/vertex fragment shaders in unity

Fragment/vertex shaders are a different beast to surface shaders: we are able to access vertex data and control pixels in much finer detail. Below is a simple frag/vert shader. The framework is similar to the surface shaders we've been looking at, only the usual CGPROGRAM block now sits within a Pass block, which lives in the SubShader. I haven't included a Properties block, but that remains unchanged. The first thing is we now have #pragmas for the frag and vert functions, and we also #include the UnityCG.cginc file, which contains lots of handy shader functions. There are other files we can include when we are working with lighting and shadows. Next up in the framework is the "appdata" struct - it essentially holds all the vertex data of our 3d model. Here we've got the vertex variable as a float4 type, and we use the POSITION attribute. The "vertex" name can be arbitrary. You can also list other data, such as normals, color (multiple) and UVs & they
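A minimal skeleton matching this description, as a sketch rather than the post's exact code:

```shaderlab
// Minimal vert/frag skeleton - a sketch of the framework described above.
SubShader {
    Pass {
        CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag
        #include "UnityCG.cginc"

        struct appdata {
            float4 vertex : POSITION;   // the name "vertex" is arbitrary
        };
        struct v2f {
            float4 pos : SV_POSITION;
        };

        v2f vert(appdata v) {
            v2f o;
            o.pos = UnityObjectToClipPos(v.vertex);  // object space -> clip space
            return o;
        }
        fixed4 frag(v2f i) : SV_Target {
            return fixed4(1, 0, 0, 1);  // flat red
        }
        ENDCG
    }
}
```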

Stencil Buffers pt2 - object that only appears behind certain "glass"

More or less the same as what happened in the last post about stencil buffers. The few lines with Enums are useful - they "enumerate", putting all the stencil functions into a drop-down list in the Inspector for us. We store these in the variables _SComp and _SOp. Notice the values are just numbers, as they are just entries in a list/array. A trick in the tutorial I watched was having an object only appear behind a certain "pane of glass" - a quad. The quad holds the stencil-window shader and faces the object (so it faces away from the camera); its SComp is set to Always and its SOp is set to Replace. The trick is that both the object and the glass have the same Ref numbers - so if you had another pane of glass with a different Ref number, the object would not appear behind it. The object's shader, beneath, has a bump and albedo in it. Set its SComp to Equal (different to the last example we did) and its SOp to Keep.

Shader "Dave/stencilwindow" {
    Properties {
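A hedged sketch of the Enum properties and Stencil block being described - the property names follow the post, and the numeric defaults are Unity's standard enum values:

```shaderlab
// Sketch - Inspector drop-downs for the stencil comparison and operation.
// In Properties:
//   [Enum(UnityEngine.Rendering.CompareFunction)] _SComp ("Stencil Comp", Float) = 8  // 8 = Always
//   [Enum(UnityEngine.Rendering.StencilOp)]       _SOp   ("Stencil Op", Float)   = 0  // 0 = Keep
// In the SubShader:
Stencil {
    Ref 1
    Comp [_SComp]   // glass: Always (8); object: Equal (3)
    Pass [_SOp]     // glass: Replace (2); object: Keep (0)
}
```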

Stencil Buffers Pt 1 - cut a hole in an object

This is quite cool stuff. You can essentially cut out areas of other objects by using the stencil buffer. There are two shaders below - one for the "hole" or the "window" (the "cutter-outer", whatever you wish to call it), and one for the wall or thing you're "cutting" into. The main CGPROGRAM block remains simple; the changes happen at the start of the SubShader. Firstly we ensure we draw the hole before the geometry, by giving it the "Geometry-1" queue tag - the smaller the value, the sooner an object is drawn. We disable writing to the zbuffer for the whole subshader, as well as to the colour buffer (ColorMask 0). The stencil has its own sub-block - in it, we define Ref with the value of 1. Comp is the comparison which compares the object's pixels with Ref. We "always" want to write to the stencil buffer using our cutter object's pixels, so it "always" passes this comparison & does what the Pass operation wants - i
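A hedged sketch of the two stencil setups being described - the exact values are taken from the text above, the rest is assumed:

```shaderlab
// Sketch of the cutter/"hole" subshader settings described above:
//   Tags { "Queue" = "Geometry-1" }   // drawn before regular geometry
//   ColorMask 0                       // no colour output
//   ZWrite Off                        // no depth output
Stencil {
    Ref 1
    Comp Always    // always passes the comparison...
    Pass Replace   // ...so every covered pixel writes Ref (1) into the buffer
}
// And for the wall being cut into: only draw where the buffer does NOT hold 1.
// Stencil {
//     Ref 1
//     Comp NotEqual
// }
```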

layering/adding/blending two textures

Here we define two textures - cat and dog. I actually had a few issues using the usual _MainTex name. Something new - we use a Toggle to turn the decal texture on and off, multiplying the dog texture by the 0 or 1 value. The thing of note is the o.Albedo calculation, where we check if the alpha channel of the b texture (the dog one) has a value above 0.9 (we allow for some float accuracy, as the alpha might not be exactly 1!). If so, we use the b texture; if not, we use the a texture. We could check the red, green or blue channel if we liked.

Shader "Dave/basictextureblend" {
    Properties {
        _cat ("cat", 2D) = "red" {}
        _dog ("dog", 2D) = "blue" {}
        [Toggle] _ShowDecal ("Show Decal", Float) = 0
    }
    SubShader {
        Tags { "Queue" = "Geometry" }
        CGPROGRAM
        #pragma surface surf Lambert
        sampler2D _cat;
        sampler2D _dog;
        float
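Since the preview cuts off before the surf function, here's a hedged sketch of the blend being described - the Input member names are assumptions:

```hlsl
// Sketch of the surf blend described above.
// Assumes: struct Input { float2 uv_cat; float2 uv_dog; };
float _ShowDecal;   // the [Toggle] supplies 0 or 1

void surf(Input IN, inout SurfaceOutput o) {
    float4 a = tex2D(_cat, IN.uv_cat);
    float4 b = tex2D(_dog, IN.uv_dog) * _ShowDecal;  // decal off => b is all zero
    // Use the dog texture wherever its alpha is (nearly) 1, otherwise the cat.
    o.Albedo = b.a > 0.9 ? b.rgb : a.rgb;
}
```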

hologram shader, Pass, colormask, transparent

This shader creates a fresnel-style inner rim glow on an object. It uses the dot product between the surface normal and the view direction vector. To give it a transparent/ghostly feel, we set the alpha to the rim-glow value. In order to actually get it transparent, we add "alpha:fade" to our #pragma and also assign the Transparent queue tag. A "problem" that arises is that we can see the internal faces of an object when it has a complex shape - e.g. the inside of the mouth cavity. This *might* be desirable in a true x-ray shader. However, if you want the outer surface to occlude the inner parts, you must add a Pass, which takes place before the pass generated by the CGPROGRAM block. Within it we add ZWrite On, which enables writing to the depth buffer, and set the ColorMask to 0 so we don't write any colour data - only the Z info.

Shader "Dave/Hologram" {
    Properties {
        _RimColor ("Rim Color", color) = (1,1,1,1)
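A hedged sketch of the structure being described - the rim maths is the standard fresnel-rim formula, not necessarily the post's exact code:

```shaderlab
// Sketch of the hologram shader structure described above.
SubShader {
    Tags { "Queue" = "Transparent" }
    Pass {            // depth-only pre-pass: outer surfaces occlude inner ones
        ZWrite On
        ColorMask 0   // write only Z info, no colour
    }
    CGPROGRAM
    #pragma surface surf Lambert alpha:fade
    float4 _RimColor;
    struct Input { float3 viewDir; };
    void surf(Input IN, inout SurfaceOutput o) {
        // rim approaches 1 where the surface is edge-on to the camera
        half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
        o.Emission = _RimColor.rgb * rim;
        o.Alpha = rim;   // ghostly: faces pointing at the camera fade out
    }
    ENDCG
}
```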

CG shaders Alpha Channels and blending

Typical alpha channel usage - for a quad with a leaf texture that has alpha values. We make sure we're in the Transparent queue and also use the Blend command. It takes two arguments: the first multiplies the incoming data (what is on the surface), and the second multiplies the existing data in the framebuffer (whatever is behind the object - Unity renders from back to front). Both values are then added. In the code example, SrcAlpha x surface colour gives you the surface colour where the alpha is white, and zero where the alpha is black. By multiplying the framebuffer by 1-SrcAlpha, we take the inverse of the alpha map and get inverted results. Now when you add these values, they sit together as if the quad has correct transparency. If, for instance, you had Blend SrcAlpha One, you'd have an additive effect of the background on your leaf. I've pasted some info about it below. Cull Off tells the shader not to discard the back-face data, which it does by defa
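The setup described above can be sketched like this (a minimal version, not the post's exact code):

```shaderlab
// Sketch: standard alpha blending for a leaf quad.
SubShader {
    Tags { "Queue" = "Transparent" }
    Cull Off                          // keep back faces so the quad shows from behind
    Blend SrcAlpha OneMinusSrcAlpha   // result = src*srcAlpha + dst*(1 - srcAlpha)
    CGPROGRAM
    #pragma surface surf Lambert alpha
    sampler2D _MainTex;
    struct Input { float2 uv_MainTex; };
    void surf(Input IN, inout SurfaceOutput o) {
        float4 c = tex2D(_MainTex, IN.uv_MainTex);
        o.Albedo = c.rgb;
        o.Alpha = c.a;   // drives the SrcAlpha factor
    }
    ENDCG
}
```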

Custom toon shader lighting model thing

Here's a custom lighting model that uses a texture (ideally a horizontal greyscale gradient) to create toon-style shading bands. You know the kind. Diffuse is calculated as in Lambert - the dot product of the surface normal and light direction. The h value here is not the halfway vector - it is a float that takes the diffuse value, halves it and adds 0.5 to get a value between 0 and 1. Remember that if one vector faces away from the other in a dot product, it returns minus one: -1 x 0.5 + 0.5 = 0. This h value is then assigned to a float2, "rh", which acts as a UV value into the "ramp" texture. We now have a greyscale value from the texture. The final colour is calculated by multiplying the albedo by the light colour and the ramp value. I guess you could also multiply it by the attenuation if you were after a non-flat shader too.

Shader "Dave/ToonRamp" {
    Properties {
        _Color ("Color", color) = (1,1,1,1)
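The steps above can be sketched as a custom lighting function - the function and texture names are assumptions:

```hlsl
// Sketch of the toon-ramp lighting function described above.
sampler2D _RampTex;   // horizontal greyscale gradient

half4 LightingToonRamp(SurfaceOutput s, half3 lightDir, half atten) {
    half diff = dot(s.Normal, lightDir);   // Lambert term, range -1..1
    float h = diff * 0.5 + 0.5;            // remap to 0..1 (so -1 -> 0)
    float2 rh = h;                         // scalar promotes to a float2 UV
    half3 ramp = tex2D(_RampTex, rh).rgb;  // banded greyscale from the texture
    half4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * ramp;  // could also multiply by atten
    c.a = s.Alpha;
    return c;
}
// paired with: #pragma surface surf ToonRamp
```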

Custom lighting models

I don't think I want to cover writing actual custom lighting models just yet. However, here is how you might implement the Lambert lighting model within your shader; Blinn is done below, with a brief explanation. Essentially they are like mini functions. They must begin with the word "Lighting" and match the name of the lighting model that you specify in the "#pragma surface surf ______" line. Lambert works by taking the dot product of the surface normal and light direction vectors. Where the light is aligned with the surface normal, the surface will seem brighter; as the angle gets closer to 90 degrees/perpendicular, the surface will be darker. This value, NdotL, is multiplied by the surface colour (albedo), the colour of any lights (_LightColor0, a built-in variable) and the attenuation, which is how intense the light is.

Shader "Dave/customLightingmodel" {
    Properties {
        _Color ("Color", color) = (1,1,1,1)
    }
    SubShader {
        Tags { "Queue" = "
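Putting that description into code, a custom Lambert function can be sketched like this (the "MyLambert" name is an assumption):

```hlsl
// Sketch of a custom Lambert lighting function, as described above.
half4 LightingMyLambert(SurfaceOutput s, half3 lightDir, half atten) {
    half NdotL = max(0, dot(s.Normal, lightDir));   // 1 when aligned, 0 at 90 degrees
    half4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * NdotL * atten;
    c.a = s.Alpha;
    return c;
}
// The name after "Lighting" must match the #pragma: #pragma surface surf MyLambert
```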

Lighting models other than Lambert - BlinnPhong, Standard, StandardSpecular

So far we've just been using the Lambert lighting model in our shaders. This is all well and good for simple non-shiny surfaces - but if we do want proper specular highlights, or more physical accuracy, we'll want to use some other lighting models.

Blinn-Phong

First up is BlinnPhong. To use it, change the usual #pragma statement to:

    #pragma surface surf BlinnPhong

Also, in the Properties block we will now be using a few more variables:

    Properties {
        _Color ("Colour", color) = (1,1,1,1)
        _SpecColor ("Specular Colour", color) = (1,1,1,1)
        _Spec ("Specular", Range(0,1)) = 0.5
        _Gloss ("Gloss", Range(0,1)) = 0.5
    }

The _Spec/Specular controls the coverage of the highlight & the _Gloss/Gloss controls the strength of the highlight. We do not have to define _SpecColor within the SubShader block - Unity already has this built in for some reason & might complain if you do redefine it. Datatypes are float4/fixed4 for Color, half for Spe
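A hedged sketch of a surf function wired up to BlinnPhong with the properties above (the _MainTex texture is an assumption):

```hlsl
// Sketch: surface shader using the built-in BlinnPhong lighting model.
// #pragma surface surf BlinnPhong
sampler2D _MainTex;   // assumed texture property
fixed4 _Color;
half _Spec;
fixed _Gloss;
struct Input { float2 uv_MainTex; };

void surf(Input IN, inout SurfaceOutput o) {
    o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _Color.rgb;
    o.Specular = _Spec;   // highlight coverage/tightness
    o.Gloss = _Gloss;     // highlight strength
    // _SpecColor is consumed by Unity's BlinnPhong lighting code - don't redeclare it.
}
```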

CG shader - World Position, if statement shorthand, frac

The shader below uses world position to create stripes on an object. It's as simple as including worldPos in the Input struct and using IN.worldPos.y (or .x or .z). We're using the shorthand if statement (the ternary operator) to assign green (0,1,0) and red (1,0,0) to the surface where worldPos.y is less than _rimThreshold. The shorthand is as follows:

    condition ? true result : false result
    IN.worldPos.y > 0.5 ? float3(0,1,0) : float3(1,0,0);

We're also using the frac function here on worldPos.y - this returns just the fractional part of the number, e.g. 3.23 would return just the 0.23 part. So in this example, 0.23 is NOT greater than 0.5, so the condition is false and we get the (1,0,0) colour. By multiplying worldPos.y by 5 (or any number) we're essentially able to repeat the pattern 5 times. The *0.5 is just further arbitrary scaling! Final point - we just multiply the colour by the dot product to give the surface a bit of shape, instead of being flat. Alternatively, yo
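The pieces above can be sketched as a surf function - a hedged reconstruction, not the post's exact code:

```hlsl
// Sketch of the world-position stripes described above.
struct Input {
    float2 uv_MainTex;
    float3 worldPos;   // Unity fills this in automatically for surface shaders
};

void surf(Input IN, inout SurfaceOutput o) {
    // frac() keeps only the fractional part; *5 repeats the bands five times,
    // *0.5 is further arbitrary scaling as noted above.
    float band = frac(IN.worldPos.y * 5 * 0.5);
    o.Albedo = band > 0.5 ? float3(0, 1, 0) : float3(1, 0, 0);
}
```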