Three.js Shaders: GLSL, ShaderMaterial & TSL (r184)
Shaders are small programs that run on the GPU — one per vertex, one per pixel — and they are what make Three.js feel like a medium instead of a library. This guide covers ShaderMaterial (classic GLSL), the anatomy of vertex and fragment shaders, how uniforms and varyings move data, and Three's TSL node graph, stable as of r184: a JavaScript-native alternative that compiles to GLSL for WebGL and to WGSL for WebGPU from one source.
What is a Shader?
A shader is a program the GPU runs in parallel. Three.js pipelines every rendered object through two shader stages:
- The vertex shader runs once per vertex. Its job is to output `gl_Position` — where the vertex lands on the screen — and optionally pass data to the fragment stage via varyings.
- The fragment shader runs once per pixel the triangle covers. Its job is to output `gl_FragColor` — the color of that pixel.
Between them, the GPU rasterizes the triangle: it figures out which pixels are inside, and for each one it interpolates the varyings across the three vertices. That interpolation is why a gradient between three corner colors looks smooth without you writing any math for the middle.
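That per-pixel interpolation is barycentric: each pixel's value is a weighted average of the three vertex values, with weights that sum to 1. A plain-JavaScript sketch of the math (the `interpolate` helper is hypothetical, for intuition only; the GPU does this in hardware):

```javascript
// Barycentric interpolation: blend three per-vertex values (e.g. RGB colors)
// by a pixel's barycentric weights, where w0 + w1 + w2 = 1.
function interpolate(v0, v1, v2, w0, w1, w2) {
  return v0.map((c, i) => c * w0 + v1[i] * w1 + v2[i] * w2);
}

// A pixel at the triangle's centroid weighs all three corners evenly:
const red = [1, 0, 0], green = [0, 1, 0], blue = [0, 0, 1];
const centroid = interpolate(red, green, blue, 1 / 3, 1 / 3, 1 / 3);
// Each channel comes out as 1/3: the smooth gradient costs you no code.
```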
In the live demo below, a sphere is rendered by a ShaderMaterial with a trivial pair of shaders — the vertex stage just projects each vertex, the fragment stage outputs a flat purple. Everything you see is the rasterizer doing its job.
ShaderMaterial Anatomy
ShaderMaterial accepts three parts: a vertex shader source, a fragment shader source, and a uniforms dictionary that lets you pass JavaScript values into both shaders at render time.
Three.js injects the camera matrices and the attribute declarations for position, normal, and uv for you. If you need rawer control — declaring every uniform and attribute yourself, or pinning a specific GLSL version — reach for RawShaderMaterial instead, which injects nothing.
The minimal ShaderMaterial is the "hello triangle" of Three.js shaders:
const mat = new THREE.ShaderMaterial({
  uniforms: {
    u_color: { value: new THREE.Color(0x8a6dff) },
  },
  vertexShader: `
    void main() {
      gl_Position = projectionMatrix
        * modelViewMatrix
        * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform vec3 u_color;
    void main() {
      gl_FragColor = vec4(u_color, 1.0);
    }
  `,
});
const mesh = new THREE.Mesh(new THREE.SphereGeometry(1, 32, 16), mat);
Vertex Shader — Moving Vertices
The vertex shader has access to the vertex's position, normal, and uv attributes, plus the standard matrices: modelMatrix, viewMatrix, projectionMatrix, and the pre-multiplied modelViewMatrix.
It must assign gl_Position — that is what puts the vertex on screen. Everything else is setup.
A common use is vertex displacement — moving the vertex along its normal by some procedural amount. The demo below does exactly this: a plane with 64×64 segments is displaced by a sin(x*freq + time) * amp wave before projection. The fragment shader colors the result by the same wave value, passed across via a varying.
uniform float u_time;
uniform float u_freq;
uniform float u_amp;
varying float vWave;

void main() {
  float wave = sin(position.x * u_freq + u_time);
  vec3 displaced = position + normal * wave * u_amp;
  vWave = wave;
  gl_Position = projectionMatrix
    * modelViewMatrix
    * vec4(displaced, 1.0);
}
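The displacement step is plain arithmetic, worth seeing once outside GLSL. A CPU sketch in JavaScript (`displaceVertex` is a hypothetical helper; on the GPU this runs once per vertex, in parallel):

```javascript
// CPU sketch of the vertex shader's displacement step:
// displaced = position + normal * sin(position.x * freq + time) * amp
function displaceVertex(position, normal, freq, amp, time) {
  const wave = Math.sin(position[0] * freq + time);
  return {
    displaced: position.map((p, i) => p + normal[i] * wave * amp),
    wave, // handed to the fragment stage as the vWave varying
  };
}

// A plane vertex at x = 0 with normal (0, 0, 1), sampled at time = PI/2:
const { displaced, wave } = displaceVertex([0, 0, 0], [0, 0, 1], 1.0, 0.25, Math.PI / 2);
// wave = sin(PI/2) = 1, so the vertex moves the full 0.25 along its normal.
```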
Fragment Shader — Coloring Pixels
The fragment shader runs per pixel. It receives whatever the vertex shader wrote into varyings — already interpolated — plus any uniforms, and produces a single output: gl_FragColor, a vec4(r, g, b, a).
Because uniforms are shared across both shader stages, you can drive colors from JavaScript state. The demo below shows a linear gradient between two colors along a configurable angle — classic background material for hero sections.
Writing procedural colors in the fragment shader is the bread and butter of creative shader work: gradients, noise, dissolve effects, stripes, fire, water. Every pattern you can describe mathematically from a UV coordinate and a time uniform, you can render in a fragment shader.
uniform vec3 u_color1;
uniform vec3 u_color2;
uniform float u_angle;
varying vec2 vUv;

void main() {
  // angle-driven gradient across UV space
  vec2 dir = vec2(cos(u_angle), sin(u_angle));
  float t = dot(vUv - 0.5, dir) + 0.5;
  vec3 col = mix(u_color1, u_color2, clamp(t, 0.0, 1.0));
  gl_FragColor = vec4(col, 1.0);
}
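The gradient's t value is just a dot product of the centered UV with a unit direction; an angle maps to the direction [cos(a), sin(a)]. A plain-JavaScript sketch of that math (`gradientT` is a hypothetical helper for illustration):

```javascript
// CPU sketch of the fragment shader's gradient parameter:
// t = dot(uv - 0.5, dir) + 0.5, clamped to [0, 1]
function gradientT(uv, angle) {
  const dir = [Math.cos(angle), Math.sin(angle)]; // unit direction from the angle
  const t = (uv[0] - 0.5) * dir[0] + (uv[1] - 0.5) * dir[1] + 0.5;
  return Math.min(1, Math.max(0, t));
}

// With a horizontal gradient (angle 0), t sweeps 0 to 1 across U:
gradientT([0.0, 0.5], 0); // 0 at the left edge
gradientT([0.5, 0.5], 0); // 0.5 in the middle
gradientT([1.0, 0.5], 0); // 1 at the right edge
```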
Uniforms & Varyings — Moving Data Around
Uniforms flow from JavaScript into both shader stages. You declare them in the ShaderMaterial uniforms dictionary, read them in GLSL with uniform <type> name;, and update them each frame by assigning to material.uniforms.name.value. No needsUpdate flag required — assigning .value is enough.
Varyings flow from the vertex shader into the fragment shader. Each vertex writes a value; the rasterizer interpolates it across the triangle; the fragment shader reads the smoothed result. This is how you get smooth color, normals, or world positions without writing any interpolation code yourself.
The demo shows an icosahedron whose vertex shader passes the world-space Y coordinate through a varying. The fragment shader uses it to blend two colors — cool at the bottom, warm at the top.
// Vertex shader — write the varying
varying vec3 vWorldPos;

void main() {
  vWorldPos = (modelMatrix * vec4(position, 1.0)).xyz;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// Fragment shader — read the interpolated varying
varying vec3 vWorldPos;
uniform vec3 u_bottom;
uniform vec3 u_top;

void main() {
  float t = smoothstep(-1.0, 1.0, vWorldPos.y);
  gl_FragColor = vec4(mix(u_bottom, u_top, t), 1.0);
}

// JS: update a uniform each frame
material.uniforms.u_time.value += clock.getDelta();
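The smoothstep call above does the remapping work. Its standard GLSL definition (clamp, then a cubic ease) is easy to sketch in JavaScript:

```javascript
// GLSL smoothstep(edge0, edge1, x): normalize x into [0, 1], clamp,
// then apply the 3t^2 - 2t^3 easing curve.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(1, Math.max(0, (x - edge0) / (edge1 - edge0)));
  return t * t * (3 - 2 * t);
}

smoothstep(-1, 1, -1); // 0, fully the bottom color
smoothstep(-1, 1, 0);  // 0.5, an even blend
smoothstep(-1, 1, 1);  // 1, fully the top color
```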
TSL — The 2026 Alternative (r184)
Three.js r184 ships TSL (Three Shading Language) as a first-class shading API. Instead of writing GLSL strings, you compose a node graph in plain JavaScript and assign it to material.colorNode or material.positionNode. The renderer lowers the graph to GLSL for WebGLRenderer and to WGSL for WebGPURenderer at compile time — one source, two backends.
TSL is just JavaScript: time, positionLocal, mix, sin, mx_noise_float are all imports from three/tsl. There are no shader strings to typo, no #include preprocessor, and no duplicated GLSL+WGSL sources to maintain.
A minimal TSL version of a displaced-wave material looks like this:

import { uniform, time, sin, positionLocal, normalLocal, vec3 } from 'three/tsl';
import { MeshStandardNodeMaterial } from 'three/webgpu';

const uAmp = uniform(0.25);
const uFreq = uniform(3.0);

// time, positionLocal, normalLocal are built-in TSL nodes
const wave = sin(positionLocal.x.mul(uFreq).add(time));
const displaced = positionLocal.add(normalLocal.mul(wave.mul(uAmp)));

const mat = new MeshStandardNodeMaterial();
mat.positionNode = displaced;
mat.colorNode = vec3(0.3, 0.4, 0.9).add(wave.mul(0.5));

// The same graph compiles to GLSL on WebGLRenderer
// and to WGSL on WebGPURenderer — no duplication.

See the live implementations: the Shaders SPA demo has a tslNoise preset that runs on WebGL via WebGLNodesHandler; the WebGPU & TSL flagship demo has four TSL node graphs, a GPU compute particle system, and a side-by-side GLSL→TSL migration viewer.

When to pick which. Reach for classic ShaderMaterial when you already have GLSL you want to port, you need exotic WebGL2 features, or the team is more comfortable in GLSL. Reach for TSL / NodeMaterial when you want WebGPU support, a refactorable JS-native graph, or the built-in PBR lighting pipeline via MeshStandardNodeMaterial.
Quick Reference: Common GLSL Built-ins and TSL Node Equivalents
| Name | Stage | What it is |
|---|---|---|
| gl_Position | Vertex (out) | Final clip-space position. Required output of every vertex shader. |
| gl_FragColor | Fragment (out) | Final RGBA color for the pixel. Required output of WebGL1 fragment shaders. |
| position | Vertex (attr) | Per-vertex local-space position. Injected by ShaderMaterial. |
| normal | Vertex (attr) | Per-vertex normal vector. Used for lighting and displacement direction. |
| uv | Vertex (attr) | Per-vertex texture coordinates. Passed to fragment via a varying. |
| modelMatrix | Uniform | Transforms local → world space. Injected by Three.js. |
| viewMatrix | Uniform | Transforms world → camera space. Injected by Three.js. |
| projectionMatrix | Uniform | Transforms camera → clip space. Injected by Three.js. |
| modelViewMatrix | Uniform | Pre-multiplied viewMatrix * modelMatrix for efficiency. |
| normalMatrix | Uniform | Inverse transpose of modelViewMatrix. Used to transform normals correctly under non-uniform scaling. |
| cameraPosition | Uniform | World-space camera position. Useful for view-dependent effects like fresnel. |
| time | TSL node | Built-in TSL node. Automatically advances each frame — no manual uniform needed. |
| positionLocal | TSL node | Per-vertex local-space position in TSL (equivalent to GLSL `position`). |
| positionWorld | TSL node | Per-vertex world-space position in TSL. Use for IBL, fog, fresnel. |
| normalLocal | TSL node | Per-vertex local-space normal in TSL (equivalent to GLSL `normal`). |
| colorNode | TSL material | Assign to any NodeMaterial to replace its fragment color with a TSL graph. |
| positionNode | TSL material | Assign to any NodeMaterial to replace vertex displacement with a TSL graph. |
| emissiveNode | TSL material | TSL graph for self-illuminated color on MeshStandardNodeMaterial. No texture required. |
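The normalMatrix row deserves a worked example: under non-uniform scale, transforming a normal by the model matrix skews it off the surface, while the inverse transpose keeps it perpendicular. A 2D sketch in plain JavaScript, illustrative numbers only:

```javascript
// Why normals need the inverse transpose: a 2D case with scale S = diag(2, 1).
const dot = (a, b) => a[0] * b[0] + a[1] * b[1];

const tangent = [1, 1];  // along the surface
const normal = [-1, 1];  // perpendicular to it: dot(tangent, normal) = 0

// Transform both vectors by the scale matrix S directly:
const tScaled = [2 * tangent[0], 1 * tangent[1]]; // [2, 1]
const nWrong = [2 * normal[0], 1 * normal[1]];    // [-2, 1]
dot(tScaled, nWrong); // -3, the "normal" is no longer perpendicular

// Transform the normal by the inverse transpose instead, diag(1/2, 1):
const nRight = [0.5 * normal[0], 1 * normal[1]];  // [-0.5, 1]
dot(tScaled, nRight); // 0, perpendicularity preserved
```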
Tips & Best Practices
Shader rendering black? Check these three things.
First, make sure the fragment shader actually writes to gl_FragColor (or out vec4 outColor in WebGL2 / RawShaderMaterial). Second, if you're using a lit shader chunk, ensure the scene has at least one light and the geometry has valid normals (geometry.computeVertexNormals()). Third, look at the browser console — GLSL compile errors show up as red text with the exact line and column of the issue.
Reuse one ShaderMaterial across meshes with identical shading.
Each ShaderMaterial instance triggers a fresh GLSL compile the first time it's rendered. If ten meshes all need the same procedural surface, give them the same material instance. Only clone or create a new one when you actually need a different uniform value or a different shader source.
Uniforms update by assigning .value — no needsUpdate flag.
Unlike geometry attributes (which need attribute.needsUpdate = true after mutation), uniforms propagate automatically. Just write material.uniforms.u_time.value = clock.getElapsedTime() each frame. The needsUpdate dance is only for attribute and texture data that's been re-uploaded to the GPU.
Pass UVs via a varying, not a uniform.
UVs are per-vertex data. Declare varying vec2 vUv; in both shaders, write vUv = uv; in the vertex shader, and the interpolated value is available in the fragment shader as vUv. This is the standard pattern for texture mapping, gradient effects, and any UV-driven procedural color.
Migrating to TSL? Write the graph once, ship to both backends.
A TSL node graph assigned to material.colorNode or material.positionNode compiles to GLSL for WebGLRenderer and to WGSL for WebGPURenderer. Set glRenderer.setNodesHandler(new WebGLNodesHandler()) to enable the WebGL path. One source, two backends — no duplicated shader code.
Try Shaders Interactively
Explore shaders with live 3D demos — adjust parameters in real-time and see the results instantly.
Open Interactive Demo

Frequently Asked Questions
What is the difference between ShaderMaterial and RawShaderMaterial in Three.js?
ShaderMaterial injects Three.js-managed uniforms (modelMatrix, projectionMatrix, etc.) and the standard attribute declarations (position, normal, uv) into your GLSL automatically. RawShaderMaterial injects nothing — you declare every uniform and attribute yourself and pick your own GLSL version (e.g., #version 300 es). Use ShaderMaterial for 95% of cases; use RawShaderMaterial when you need exotic WebGL2 features or maximum control over the shader preamble.
What is TSL in Three.js and when should I use it instead of ShaderMaterial?
TSL — Three Shading Language — is a JavaScript node-graph API introduced as stable in r184. You compose shading logic from nodes like positionLocal, time, mix, and mx_noise_float, then assign the graph to material.colorNode or positionNode. The renderer compiles it to GLSL for WebGL and to WGSL for WebGPU. Use TSL when you want WebGPU support, a refactorable JS-native source, or the built-in PBR lighting pipeline via MeshStandardNodeMaterial. Stick with ShaderMaterial for existing GLSL codebases or WebGL-only projects where the team is already fluent in GLSL.
How do I pass a texture to a Three.js shader?
Load the texture with a TextureLoader and add it to your uniforms as a sampler2D: uniforms: { u_map: { value: loadedTexture } }. In GLSL declare uniform sampler2D u_map; and sample with texture2D(u_map, vUv) (WebGL1) or texture(u_map, vUv) (WebGL2 / GLSL 3.00). For TSL, use the texture() node from three/tsl — same graph runs on WebGL and WebGPU.
Why is my Three.js shader rendering completely black?
Three common causes. (1) A GLSL compile error — check the browser console for red error text with line numbers. (2) The fragment shader isn't writing gl_FragColor, or is writing it as vec4(0.0). (3) If you used Three.js' built-in lighting chunks, the geometry needs valid normals (geometry.computeVertexNormals()) and the scene needs at least one light. If all three check out, try outputting gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); to confirm the shader runs at all.
How do I animate a Three.js shader over time?
Add a u_time uniform to your material and update it each frame: material.uniforms.u_time.value = clock.getElapsedTime() using a THREE.Clock. In GLSL read uniform float u_time; and use it in sin(), offsets, or mix amounts. In TSL, time is a built-in node — no manual uniform or per-frame update is needed; the renderer advances it for you.