UE5 Rendering Pipeline

A lot of people have been trying to figure out the Unreal Engine 5 rendering system over the past few years, and I dare say not everyone who tried has made significant progress. The reasons are obvious: it is a huge amount of information, the rendering system is constantly being improved, and the changes in each version are part of the problem. On top of that, the renderer has an incredible number of settings that change the order and number of internal passes. For those who haven't given up and are still trying to dig deeper, I've prepared a diagram that shows the execution order of the render dependency graph on the GPU. There is a fundamental structure that is always present, and in this particular case you can see it.


Deferred rendering always starts with the PrePass. At this stage the vertex shader draws the geometry and produces the scene depth. When Nanite geometry is used, a Nanite VisBuffer pass appears; it is this stage of the pipeline that has attracted an incredible amount of attention from specialists and fans of computer graphics, because the Nanite surface is rasterized there. Once the depth of the Nanite geometry has been produced, it is merged with the depth calculated earlier.
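To make that merge step concrete, here is a minimal C++ sketch of the idea (not engine code): each pixel keeps the depth of whichever surface is closer, assuming the reverse-Z convention UE uses internally, where a larger depth value means a surface closer to the camera.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Illustrative only: merging a Nanite depth buffer into the depth written by
// the regular PrePass. Assumes a reverse-Z convention (larger value = closer
// to the camera).
void MergeDepth(std::vector<float>& sceneDepth,
                const std::vector<float>& naniteDepth)
{
    for (std::size_t i = 0; i < sceneDepth.size(); ++i)
    {
        // Keep whichever surface is closer to the camera at this pixel.
        sceneDepth[i] = std::max(sceneDepth[i], naniteDepth[i]);
    }
}
```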
The next important stage is LumenSceneUpdate. Here the engine gathers scene data and updates Lumen's internal representation of it (the distance fields, the voxel structure, the surface cache, and so on).
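Lumen's software tracing leans heavily on signed distance fields. The sketch below is a generic sphere-tracing loop against a hypothetical unit-sphere SDF, just to illustrate how a distance field representation is queried; it is not Lumen's code.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Signed distance to a unit sphere at the origin -- a stand-in for the
// mesh distance fields that Lumen traces against.
static float SphereSDF(Vec3 p) { return length(p) - 1.0f; }

// Sphere tracing: step along the ray by the distance to the nearest
// surface until we hit it or give up.
bool TraceDistanceField(Vec3 origin, Vec3 dir, float& hitT)
{
    float t = 0.0f;
    for (int i = 0; i < 64; ++i)
    {
        float d = SphereSDF(add(origin, mul(dir, t)));
        if (d < 1e-3f) { hitT = t; return true; }
        t += d;
        if (t > 100.0f) break;
    }
    return false;
}

int main()
{
    float t;
    if (TraceDistanceField({0, 0, -5}, {0, 0, 1}, t))
        std::printf("hit at t = %.3f\n", t); // expect roughly 4.0
}
```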
The next stage, whose importance is hard to overstate, is the BasePass. This is where the pixel shaders run, and their output is written to the GBuffer.
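Conceptually, each GBuffer texel stores the material attributes that the later lighting passes need. Below is a simplified, hypothetical layout; the real UE5 GBuffer packs more data (shading model ID, specular, custom data) into several render targets.

```cpp
// Hypothetical, simplified GBuffer texel for illustration only.
struct GBufferTexel
{
    float Normal[3];    // world-space normal
    float BaseColor[3]; // albedo
    float Metallic;
    float Roughness;
    float Depth;        // scene depth from the PrePass
};

// The base pass pixel shader conceptually fills one such texel per pixel
// from the material's inputs; the lighting passes read it back later.
void WriteGBufferTexel(GBufferTexel& out, const float normal[3],
                       const float baseColor[3], float metallic,
                       float roughness, float depth)
{
    for (int i = 0; i < 3; ++i)
    {
        out.Normal[i]    = normal[i];
        out.BaseColor[i] = baseColor[i];
    }
    out.Metallic  = metallic;
    out.Roughness = roughness;
    out.Depth     = depth;
}
```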
The DiffuseIndirectAndAO stage adds diffuse indirect lighting and ambient occlusion: soft global illumination and subtle contact shadows that greatly improve the realism of the scene.
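In the simplest terms, ambient occlusion scales the indirect term before it is added to direct lighting. A toy sketch, with single luminance values purely for brevity:

```cpp
#include <algorithm>

// Toy sketch: modulate indirect (ambient) lighting by an ambient-occlusion
// factor before adding it to direct lighting. Not UE5 code.
float CombineDirectAndIndirect(float direct, float indirect, float ao)
{
    ao = std::clamp(ao, 0.0f, 1.0f); // 0 = fully occluded, 1 = fully open
    return direct + indirect * ao;
}
```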
The ShadowDepth stage performs object culling first: only objects that cast shadows and are visible to the light are rendered. The shadow data is then split into multiple virtual shadow map (VSM) pages that are updated only as needed. Finally, shadows cast by dynamic objects are multiplied into the result.
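Whatever the paging scheme, the test that ultimately decides whether a pixel is shadowed is the classic shadow-map depth comparison. A hedged sketch, with a purely illustrative bias value:

```cpp
// Classic shadow-map test that the VSM pages ultimately serve: compare the
// receiver's depth in light space against the depth stored by the
// ShadowDepth pass, rendered from the light's point of view.
// The bias value is illustrative; real engines tune it per light.
bool IsInShadow(float receiverLightSpaceDepth, float shadowMapDepth,
                float bias = 0.001f)
{
    // Something closer to the light than this receiver means shadow.
    return shadowMapDepth + bias < receiverLightSpaceDepth;
}
```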
The Lights stage calculates how scene lighting interacts with surfaces. Direct Lighting computes the contribution of light sources such as the sun or lamps to the visible geometry. Standard Deferred lighting then combines this with the material data (normals, roughness, etc.) in screen space to produce the final lit image, handling many dynamic lights efficiently thanks to the deferred approach.
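The core of deferred shading fits in a few lines: for every pixel the normal and albedo are already in the GBuffer, so lighting becomes a screen-space loop over lights. The Lambert-only sketch below leaves out specular, attenuation, and shadows; it illustrates the structure, not UE5's shaders.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

struct DirectionalLight { Vec3 Direction; Vec3 Color; float Intensity; };

// Deferred shading in miniature: the GBuffer already holds the normal and
// albedo for this pixel, so we simply accumulate each light's contribution.
Vec3 ShadeDeferredPixel(Vec3 normal, Vec3 albedo,
                        const std::vector<DirectionalLight>& lights)
{
    Vec3 result = {0, 0, 0};
    for (const DirectionalLight& light : lights)
    {
        float nDotL = std::max(0.0f, dot(normal, mul(light.Direction, -1.0f)));
        Vec3 diffuse = {albedo.x * light.Color.x, albedo.y * light.Color.y,
                        albedo.z * light.Color.z};
        result = add(result, mul(diffuse, nDotL * light.Intensity));
    }
    return result;
}
```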
The SingleLayerWater pass renders realistic water surfaces using a physically based model. It simulates reflections, refractions, and light absorption in a single unified layer, allowing natural interaction with lighting and the environment. The pass is integrated into the lighting and post-processing pipeline so that the water blends seamlessly with the rest of the scene.
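One piece of that model, light absorption, follows the Beer-Lambert law: the longer the path under the surface, the less of the bottom you see. A single-channel sketch with made-up coefficients:

```cpp
#include <cmath>

// Beer-Lambert absorption: transmittance falls off exponentially with the
// distance travelled under the water surface. Coefficients are illustrative.
float WaterTransmittance(float waterDepthMeters, float absorptionCoefficient)
{
    return std::exp(-absorptionCoefficient * waterDepthMeters);
}

// Example blend: the deeper the water, the less the bottom contributes and
// the more the water's own scattering color dominates.
float BlendSeenThroughWater(float bottomColor, float waterColor,
                            float waterDepthMeters, float absorption)
{
    float t = WaterTransmittance(waterDepthMeters, absorption);
    return bottomColor * t + waterColor * (1.0f - t);
}
```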
The SkyAtmosphere pass simulates the scattering of sunlight through the atmosphere for realistic skies and horizons. Exponential Height Fog adds ground-level fog that fades with height, enhancing depth and mood. Volumetric Cloud Compose Over Scene integrates 3D clouds into the final image, blending them naturally over the lit environment to preserve depth, softness, and lighting consistency.
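Exponential height fog gets its name from the density function: density decays exponentially with height, and the fog amount grows with distance. The sketch below skips the analytic integration along the ray that the engine performs and simply evaluates the density at the camera's height; parameter names are illustrative.

```cpp
#include <cmath>

// Simplified exponential height fog: returns the fraction of the background
// that the fog replaces along a ray of the given length. Illustrative only.
float HeightFogFactor(float cameraHeight, float distance,
                      float fogDensity, float heightFalloff)
{
    // Fog density at the camera's height.
    float density = fogDensity * std::exp(-heightFalloff * cameraHeight);
    // More distance travelled through the fog means less of the scene visible.
    return 1.0f - std::exp(-density * distance);
}
```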
The Translucency phase handles the rendering of transparent and semi-transparent materials. It is divided into several passes depending on how they interact with other effects such as distortion, depth of field, or motion blur. These passes are rendered in parallel for efficiency and carefully ordered so that they blend correctly with the opaque scene.
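The ordering requirement comes from the blend math itself: translucent fragments have to be composited back to front with the "over" operator on top of the opaque result. A single-channel sketch of that idea:

```cpp
#include <algorithm>
#include <vector>

struct TranslucentFragment
{
    float Depth; // distance from the camera
    float Color; // single channel for brevity
    float Alpha; // 0 = fully transparent, 1 = opaque
};

// Illustrative only: sort translucent fragments back to front and blend
// them over the opaque scene color with the standard "over" operator.
float ComposeTranslucency(float opaqueColor,
                          std::vector<TranslucentFragment> frags)
{
    std::sort(frags.begin(), frags.end(),
              [](const TranslucentFragment& a, const TranslucentFragment& b)
              { return a.Depth > b.Depth; }); // farthest first
    float result = opaqueColor;
    for (const TranslucentFragment& f : frags)
        result = f.Color * f.Alpha + result * (1.0f - f.Alpha);
    return result;
}
```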
The PostProcessing stage refines the final image by simulating how cameras and the human eye perceive the world. It improves realism and visual quality through effects such as dynamic exposure, motion-based blurring, depth-based focus, and light-based artifacts. This phase also includes intelligent upscaling and temporal data blending to improve sharpness and stability across frames, ultimately delivering a polished, cinematic result.
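As a taste of what happens here, the sketch below applies an exposure derived from the scene's average luminance and then compresses the result with a simple Reinhard curve. UE's actual filmic tonemapper and auto-exposure are considerably more involved; this only shows the shape of the idea.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative exposure + tonemap: scale HDR luminance toward a mid-grey
// target, then compress into displayable range with a Reinhard curve.
float Tonemap(float hdrLuminance, float averageSceneLuminance)
{
    const float keyValue = 0.18f; // target mid-grey
    float exposure = keyValue / std::max(averageSceneLuminance, 1e-4f);
    float exposed  = hdrLuminance * exposure;
    return exposed / (1.0f + exposed); // Reinhard
}
```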
At the end, the image is written to the BackBuffer, on top of which the UI is drawn.

The diagram shows far from all of the processes, and this description covers even fewer; it is impossible to fit comprehensive material into a single article. I can only hope this dive was useful to you.
I also want to thank Yevhen Kostenko for being the one who inspired me to dive deeper into rendering.