Today, we’ll dive into vertex shaders. Previously, we explored the Metal Shading Language, the rendering pipeline and its setup, so now our focus shifts specifically to vertex and fragment shaders. The examples will remain simple and concise, as different tasks often require unique solutions. To isolate shader concepts, I’ll be using the KodeLife shader editor, which lets us experiment with shaders independently from other pipeline stages.
Now, we can set aside the details of the entire rendering pipeline and concentrate solely on these three steps:

Building the entire Metal infrastructure just for shader development can feel redundant, so let’s use a shader-editing tool instead:

This setup is sufficient for our current purpose. Now, let’s get prepared:
- Open the Renderer section (3), set it to Metal, and you should see something like the screen above (1).
- Set the primitive to Sphere, the type to LINE_STRIP, and the instance count to, say, 5.
- Set the Projection to Perspective.
Let’s begin with the “default” shader provided by KodeLife. While a real-world shader might look quite different, this starting point is effective and works well for most simple tasks.
NOTE: I’ve made some slight adjustments to the original KodeLife naming.
#include <metal_stdlib> // (1)

using namespace metal; // (2)

struct VertexInput { // (3)
    float4 position [[attribute(0)]];
    float3 normal [[attribute(1)]];
    float2 texcoord [[attribute(2)]];
};

struct VertexOutput { // (4)
    float4 position [[position]];
    float3 normal;
    float2 texCoord;
};

struct Parameters { // (5)
    float time;
    float2 resolution;
    float2 mouse;
    float3 spectrum;
    float4x4 mvp;
};

vertex VertexOutput vs_main( // (6)
    VertexInput input [[stage_in]], // (7)
    constant Parameters& uniform [[buffer(16)]]) { // (8)
    VertexOutput out;
    out.position = uniform.mvp * input.position; // (9)
    out.normal = input.normal;
    out.texCoord = input.texcoord;
    return out;
}
Here's what each numbered comment means:

1. Include the Metal standard library.
2. Use the metal namespace. If you prefer to write metal:: explicitly, you can skip this step.
3. The vertex input structure: the per-vertex attributes read from the vertex buffers.
4. The vertex output structure. The position field is marked [[position]], representing the vertex's position in the viewport. Without this, rendering won't work, as there would be no displayable position.
5. The uniforms: time, output resolution, mouse position and status, spectrum (just low, mid, high), and mvp - the Model, View, Projection matrix. This is actually Projection * View * Model, used to calculate vertices' final positions in the viewport.
6. The vertex function. It returns VertexOutput; this same structure will serve as [[stage_in]] for the fragment shader, with interpolated values. The function name can vary, but KodeLife uses vs_main.
7. [[stage_in]] assembles the VertexInput attributes from the corresponding buffers. For more details, refer to the previous episode.
8. The uniforms arrive as a constant buffer bound at index 16.
9. The vertex position is multiplied by the mvp matrix to get the final position in the viewport.

Now, the fragment shader:

#include <metal_stdlib>
using namespace metal;

struct FragmentInput { // (1)
    float3 normal;
    float2 texCoord;
};

struct Parameters { // (2)
    float time;
    float2 resolution;
    float2 mouse;
    float3 spectrum;
};

fragment float4 fs_main( // (3)
    FragmentInput In [[stage_in]], // (4)
    constant Parameters& params [[buffer(16)]]) {
    float2 uv = -1. + 2. * In.texCoord;
    float4 col = float4(
        abs(sin(cos(params.time + 3. * uv.y) * 2. * uv.x + params.time)),
        abs(cos(sin(params.time + 2. * uv.x) * 3. * uv.y + params.time)),
        params.spectrum.x * 100.,
        1.0);
    return col; // (5)
}
Again, step by step:

1. The fragment input mirrors the vertex shader's output; the [[position]] parameter is omitted here, but you can include it if needed.
2. The same uniforms as in the vertex shader, this time without the mvp matrix.
3. The fragment function. It returns a float4 since there's only one color attachment. If needed, you can define a structure with multiple attachments like [[color(0)]], [[color(1)]], [[depth(any)]], etc. (see the sketch after this list).
4. [[stage_in]] delivers the values produced by the vertex stage, interpolated across the primitive.
5. The computed color is returned and written to the color attachment.
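For reference, here's a minimal sketch of what such a multi-attachment output could look like. This is my own illustration rather than part of the KodeLife template, and it assumes a pipeline that actually has two color attachments and a depth attachment configured; the struct and function names are made up for the example.

#include <metal_stdlib>
using namespace metal;

// Hypothetical fragment input, mirroring the article's structures,
// this time with [[position]] included so we can write out a depth value.
struct MultiFragmentInput {
    float4 position [[position]];
    float3 normal;
    float2 texCoord;
};

// One field per attachment: two color targets plus an explicit depth output.
struct MultiFragmentOutput {
    float4 color0 [[color(0)]];
    float4 color1 [[color(1)]];
    float depth [[depth(any)]];
};

fragment MultiFragmentOutput fs_multi(MultiFragmentInput In [[stage_in]]) {
    MultiFragmentOutput out;
    out.color0 = float4(abs(In.normal), 1.0);   // visualize the normal
    out.color1 = float4(In.texCoord, 0.0, 1.0); // visualize the UVs
    out.depth = In.position.z;                  // pass the rasterized depth through
    return out;
}

If your render target has only one color attachment, as in the examples here, returning a plain float4 is all you need.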
Having five spheres in one spot is obviously quite dull. Let's enhance the composition by moving them around the viewport and adding some animation.
Since we have five instances, there's no need for extra models; we can simply reuse the original geometry. This is achieved with the instance index. Let's add a few lines to our vertex function to make the composition more dynamic:
vertex VertexOutput vs_main(
    unsigned int vid [[vertex_id]], // (1)
    unsigned int iid [[instance_id]], // (2)
    VertexInput input [[stage_in]],
    constant Parameters& params [[buffer(16)]]
) {
    VertexOutput out;

    float4 position = input.position; // (3)

    float wave = 1.0 + sin(vid * 0.01 + params.time + iid) * 0.1;
    position.xyz *= wave; // (4)

    float a = iid;
    position = float4x4(
        float4( cos(a), sin(a), 0, 0),
        float4(-sin(a), cos(a), 0, 0),
        float4(      0,      0, 1, 0),
        float4(      0,      0, 0, 1)
    ) * position; // (5)

    float3 offset = float3(iid - 3.0, iid % 3 - 1.0, iid * 0.4);
    position.xyz += offset; // (6)

    out.position = params.mvp * position; // (7)
    out.normal = input.normal;
    out.texCoord = float2(iid, vid); // (8)
    return out;
}
Here's what changed:

1. vid is the index of the current vertex, passed in via [[vertex_id]].
2. iid is the index of the current instance, passed in via [[instance_id]].
3. Copy the input position so we can modify it.
4. Scale the vertex by a wave driven by the vertex index, the time, and the instance index. We could just as well have used, say, the z-coordinate, but using vertex indices can be helpful, depending on the context.
5. Rotate each instance around the z-axis; the angle is simply the instance index, in radians.
6. Offset each instance so the spheres are spread across the viewport instead of stacking in one spot.
7. Apply the mvp matrix to get the final position in the viewport.
8. Pass the instance and vertex indices to the fragment shader through texCoord. Ideally, this field should be renamed (see the sketch after this list).
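As a side note on (8), here's a rough sketch of what that renaming could look like; the field name ids is my own choice, and the shaders below keep using texCoord as-is.

struct VertexOutput {
    float4 position [[position]];
    float3 normal;
    float2 texCoord;
    float2 ids; // x = instance index, y = vertex index
};

In vs_main you would then write out.ids = float2(iid, vid); and read In.ids in the fragment shader instead of In.texCoord.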
It’s still a bit dull, even though the shapes have become more interesting. Now, let’s focus on improving the coloring:
struct FragmentInput {
    float4 position [[position]]; // (1)
    float3 normal;
    float2 texCoord;
};

fragment float4 fs_main(
    FragmentInput In [[stage_in]],
    constant Parameters& params [[buffer(16)]]) {
    // (2)
    float wave = sin(5 * fract(In.texCoord.y * 0.02 + params.time + In.texCoord.x));
    wave = 1.0 - smoothstep(0.0, 0.3, wave);

    // (3)
    float3 lightPos = float3(1000.0, 100.0, 1.0);
    float3 lightDir = normalize(lightPos - In.position.xyz);
    float lightness = clamp(dot(lightDir, In.normal), 0.0, 1.0);
    lightness = mix(0.2, 2.0, lightness);

    // (4)
    return float4(In.texCoord.x * 0.2, wave, In.texCoord.y * 0.0005, 1.0) * lightness;
}
Step by step:

1. Add the position with the [[position]] attribute to the fragment input so the lighting can use the fragment's position.
2. Compute a spiral wave from the interpolated indices stored in texCoord and the time.
3. Calculate simple diffuse lighting from a fixed light position: take the dot product of the light direction and the normal, clamp it, and use mix to set the minimum and maximum lightness values (a small refactoring sketch follows this list).
4. The final color combines the indices (passed through texCoord), the spiral wave, and the calculated lightness.
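If you end up reusing this lighting in several shaders, the steps from (3) can be pulled out into a small helper. The sketch below is just a refactor of the code above under that assumption; the function name and parameters are my own.

#include <metal_stdlib>
using namespace metal;

// Diffuse term remapped to a [minLight, maxLight] range,
// equivalent to step (3) in the fragment shader above.
float diffuseLightness(float3 lightPos, float3 fragPos, float3 normal,
                       float minLight, float maxLight) {
    float3 lightDir = normalize(lightPos - fragPos);
    float lightness = clamp(dot(lightDir, normal), 0.0, 1.0);
    return mix(minLight, maxLight, lightness);
}

The call in fs_main then becomes float lightness = diffuseLightness(float3(1000.0, 100.0, 1.0), In.position.xyz, In.normal, 0.2, 2.0);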
