
Showing posts with the label shader

Fragment/Vertex shader, casting and receiving shadows

There are two parts to this - let's focus on the first Pass block. The first block allows the shader to receive shadows; the previous ones were all flat. The LightMode tag marks the pass as forward lighting, which tells Unity to use its forward rendering path for it. We have some extra #pragma stuff that I don't fully understand - apparently it's there so we can have complete control over the lighting. Some extra #includes are present as well... more functions to do with lighting. In our appdata struct we're using the NORMAL now. In the v2f we call the "position" variable "pos", because later on the TRANSFER_SHADOW macro requires that specific name within the struct - it won't recognise "vertex". There's also a SHADOW_COORDS entry, which carries the shadow map coordinates for the fragment. In our vert function we convert the vertex to screenspace, calculate its worldspace normal, then get the dot product between the normal & the light's positi...
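The structs and vert function described above can be sketched roughly like this for the built-in pipeline; the struct names and TEXCOORD indices are illustrative, not necessarily the post's exact code.

```hlsl
#include "UnityCG.cginc"
#include "Lighting.cginc"
#include "AutoLight.cginc"

struct appdata
{
    float4 vertex : POSITION;
    float3 normal : NORMAL;
};

struct v2f
{
    float4 pos : SV_POSITION;     // must be named "pos" for TRANSFER_SHADOW
    float3 worldNormal : TEXCOORD0;
    SHADOW_COORDS(1)              // shadow map coordinates, in TEXCOORD1
};

v2f vert (appdata v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);             // object -> clip space
    o.worldNormal = UnityObjectToWorldNormal(v.normal); // object -> world space
    TRANSFER_SHADOW(o)                                  // fill in the shadow coords
    return o;
}
```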

Compute Shaders, flock with instanced meshes + frag/vert

Instead of getting data back from the GPU buffer, we're going to use this call - Graphics.DrawMeshInstancedIndirect(boidMesh, 0, boidMaterial, bounds, argsBuffer, 0); - to draw instances of a mesh. Note the argsBuffer - this is a new type of buffer, containing arguments. It is defined in the C# script, and we only initialise the first 2 entries of the array that we fill the argsBuffer with. It can actually hold more information, but for this example we only provide it with the index count of the mesh and the number of instances we want to draw. Argument buffers seem to be very specific to this type of instanced mesh drawing in Unity - it doesn't seem easy to find information about them... The compute shader and frag/vert shader are not directly concerned with the argument buffer; the call above is the only actual reference to the argsBuffer! The compute shader only updates the position/direction of the boids. The frag/vert shader (which is actually a surface shad...
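A minimal sketch of the args buffer setup described above, assuming boidMesh, boidMaterial, bounds and numBoids already exist (those names are illustrative):

```csharp
// The indirect args layout is: index count per instance, instance count,
// start index, base vertex, start instance. We only fill the first two.
uint[] args = new uint[5] { 0, 0, 0, 0, 0 };
args[0] = boidMesh.GetIndexCount(0);   // index count of submesh 0
args[1] = (uint)numBoids;              // how many instances to draw

ComputeBuffer argsBuffer = new ComputeBuffer(
    1, args.Length * sizeof(uint), ComputeBufferType.IndirectArguments);
argsBuffer.SetData(args);

// then, each frame:
Graphics.DrawMeshInstancedIndirect(boidMesh, 0, boidMaterial, bounds, argsBuffer, 0);
```

The remaining three entries stay at zero here, which means "start at the beginning of the index buffer, vertex 0, instance 0".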

Compute Shaders, particle system with quads, using Frag/Vert shader to draw

This is more or less the same as the last wall of code. However, this time we're using the Graphics draw to draw a quad, which consists of four points / two triangles. We need to adjust our buffer and create a new struct for this, as well as initialise our arrays in a specific way to ensure the quads face the correct direction. As mentioned, our new struct - Vertex - stores a position, a UV value and a life. We're going to use the compute shader to calculate the position of each particle, but then draw a quad at that position. We set the specific values for the vertices in the compute shader too (the UVs don't change for each quad, so we set that value in the C# script).

//Triangle 1 - bot left, top left, top right
vertexBuffer[index].position.x = p.position.x - halfSize;
vertexBuffer[index].position.y = p.position.y - halfSize;
vertexBuffer[index].position.z = p.position.z;
vertex...
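The pattern above continues for all six vertices. A rough sketch of the whole quad write - the Particle struct, buffer names and halfSize are assumed to match the post's setup, and the exact winding may differ:

```hlsl
struct Vertex
{
    float3 position;
    float2 uv;
    float life;
};

RWStructuredBuffer<Vertex> vertexBuffer;

// called from the kernel after updating particle p for this thread
void WriteQuad (uint particleIndex, Particle p, float halfSize)
{
    uint index = particleIndex * 6;   // six vertices per particle

    // Triangle 1 - bottom left, top left, top right
    vertexBuffer[index + 0].position = p.position + float3(-halfSize, -halfSize, 0);
    vertexBuffer[index + 1].position = p.position + float3(-halfSize,  halfSize, 0);
    vertexBuffer[index + 2].position = p.position + float3( halfSize,  halfSize, 0);

    // Triangle 2 - bottom left, top right, bottom right
    vertexBuffer[index + 3].position = p.position + float3(-halfSize, -halfSize, 0);
    vertexBuffer[index + 4].position = p.position + float3( halfSize,  halfSize, 0);
    vertexBuffer[index + 5].position = p.position + float3( halfSize, -halfSize, 0);
}
```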

Compute Shaders, a basic particle system drawn with a vertex fragment shader

A post for reference really... Not a lot of magic here - except we're using Graphics procedural drawing - points - to draw particles. To do this we calculate the position of the particles in the compute shader, then a standard vert/frag shader assigned to a material reads from the GPU buffer and draws the points accordingly. We're also converting the mouse position on screen to world space & having the particles follow that coordinate. The C# is as follows - careful with the comments I've left... Blogger's formatting is a bit wonky -

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

#pragma warning disable 0649

public class particle_dave : MonoBehaviour
{
    private Vector2 cursorPos;

    // struct of a particle, fairly simple attributes
    struct Particle
    {
        public Vector3 position;
        public Vector3 vel...
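The mouse-to-world-space step mentioned above could be sketched like this; the z distance, kernel handle and variable names are assumptions rather than the post's actual code:

```csharp
void Update()
{
    // project the mouse position onto a plane a fixed distance from the camera
    Vector3 mouse = Input.mousePosition;
    mouse.z = 10f;   // assumed distance in front of the camera
    Vector3 world = Camera.main.ScreenToWorldPoint(mouse);
    cursorPos = new Vector2(world.x, world.y);

    // hand it to the compute shader and update the particles
    computeShader.SetVector("cursorPos", new Vector4(cursorPos.x, cursorPos.y, 0, 0));
    computeShader.Dispatch(kernelHandle, particleCount / 64, 1, 1);
}
```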

Compute Shader, read from calculated buffer

Use case - get the GPU to calculate a bunch of X,Y,Z coordinates for an arbitrary number of instanced prefabs in Unity. On the Unity C# side we need: to specify a compute shader, a handle for the compute shader, a buffer that we'll be writing to in the shader, a prefab, the number of prefabs we want, an array for the instanced prefabs, an array for the coordinate data, and any extra variables we want to pass over, eg time. On the compute shader side we need: to specify a read-write buffer, instead of the texture we created in the previous post. The buffer will be of float3 type, which is HLSL's name for a Vector3. That's actually it... Here's the compute shader code -

#pragma kernel boxMove

RWStructuredBuffer<float3> yesBlad;
float time;

[numthreads(64,1,1)]
void boxMove (uint3 id : SV_DispatchThreadID)
{
    float xpos = (float)id.x;
    float ypos = sin(id.x + time) * 50;
    float zpos = cos(id.x + time) * 50;
   ...
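The C# side listed above could look roughly like this; the kernel and buffer names match the compute shader, everything else is an illustrative sketch:

```csharp
using UnityEngine;

public class BoxMover : MonoBehaviour   // class name is illustrative
{
    public ComputeShader shader;
    public GameObject prefab;
    public int count = 64;

    private int handle;
    private ComputeBuffer buffer;
    private GameObject[] instances;
    private Vector3[] positions;

    void Start()
    {
        handle = shader.FindKernel("boxMove");
        buffer = new ComputeBuffer(count, sizeof(float) * 3);  // one float3 each
        shader.SetBuffer(handle, "yesBlad", buffer);

        positions = new Vector3[count];
        instances = new GameObject[count];
        for (int i = 0; i < count; i++)
            instances[i] = Instantiate(prefab);
    }

    void Update()
    {
        shader.SetFloat("time", Time.time);
        shader.Dispatch(handle, count / 64, 1, 1);  // match [numthreads(64,1,1)]
        buffer.GetData(positions);                  // read results back to the CPU
        for (int i = 0; i < count; i++)
            instances[i].transform.position = positions[i];
    }

    void OnDestroy() => buffer.Release();
}
```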

Compute Shader, absolute minimum

First make a compute shader - right click in the Project window - Create, Shader, Compute Shader. The default code will look as follows:

// Each #kernel tells which function to compile; you can have many kernels
#pragma kernel CSMain

// Create a RenderTexture with enableRandomWrite flag and set it
// with cs.SetTexture
RWTexture2D<float4> Result;

[numthreads(8,8,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    // TODO: insert actual code here!
    Result[id.xy] = float4(id.x & id.y, (id.x & 15)/15.0, (id.y & 15)/15.0, 0.0);
}

To keep things simple, let's just replace that last Result... line with

Result[id.xy] = float4(1,1,0,0);

This will make our shader produce a yellow colour. Note the #pragma kernel is called CSMain - this is basically the function name we'll be calling from the C# script. To go with the compute shader, we need a C# script that assigns the shader's output to our geometry. Let's use a Qu...
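The C# side described above could look roughly like this - a minimal sketch, assuming the script sits on an object (e.g. a Quad) with a Renderer:

```csharp
using UnityEngine;

public class MinimalCompute : MonoBehaviour   // class name is illustrative
{
    public ComputeShader shader;
    private RenderTexture tex;

    void Start()
    {
        tex = new RenderTexture(256, 256, 0);
        tex.enableRandomWrite = true;   // required for RWTexture2D writes
        tex.Create();

        int handle = shader.FindKernel("CSMain");   // matches #pragma kernel CSMain
        shader.SetTexture(handle, "Result", tex);
        shader.Dispatch(handle, 256 / 8, 256 / 8, 1);  // match [numthreads(8,8,1)]

        // show the result on this object's material
        GetComponent<Renderer>().material.mainTexture = tex;
    }
}
```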

Legacy / Built-in Renderer Post Processing, Tilt Brush & Unity

The Tiltbrush export from the Oculus Quest comes as a .glb file. This can be opened in Blender & also Houdini - but it won't really look like much without the Unity shaders. So... find the latest Unity Tiltbrush package online somewhere and install it in the latest Unity - at time of writing, I used Unity 2020.1.2f1 with Tiltbrush Unity Package 23.0.1. At some point Unity or Google will probably stop supporting this! Once you've installed the package, you should be able to import your glb file & it will display correctly in Unity! Hooray! What's that? Some of it is displaying as bright pink? Oh... did you make a URP or HDRP project? Bad news. Take that fancy stuff outta here. Tiltbrush shaders only work with the legacy/built-in renderer. What's that? You want to make it nice and post processed though? Oh ok... You can still use the old post processing layer system. Here's how - in your Project window/Assets area, right click and make a new Post Processing P...

Procedural line in shader

A quick shader graph tidbit, lifted from https://www.codinblack.com/the-big-shader-graph-tutorial-second-part/ - how to put a tweakable line on an object. Naturally this doesn't quite work with the VFX Graph, as the position is evaluated per-object (ie, the whole VFX Graph) rather than per particle.