Introduction: The Art of Real-Time Rendering
Series Overview: This is Part 9 of our 16-part Unity Game Engine Series. We explore the rendering pipeline — the system that transforms 3D geometry, materials, and lighting data into the pixels you see on screen, running 30-144 times per second.
| # | Part | Topics |
|---|------|--------|
| 1 | Unity Basics & Interface | Editor overview, assets, prefabs, architecture |
| 2 | C# Scripting Fundamentals | MonoBehaviour, coroutines, input systems, patterns |
| 3 | GameObjects & Components | Transforms, renderers, custom components |
| 4 | Physics & Collisions | Rigidbody, colliders, raycasting, forces |
| 5 | UI Systems | Canvas, uGUI, UI Toolkit, responsive design |
| 6 | Animation & State Machines | Animator, blend trees, IK, Timeline |
| 7 | Audio & Visual Effects | AudioSource, particles, VFX Graph, post-processing |
| 8 | Building & Publishing | Build pipeline, optimization, platforms, monetization |
| 9 | Rendering Pipelines (**You Are Here**) | URP, HDRP, Shader Graph, lighting systems |
| 10 | Data-Oriented Tech Stack | ECS, Jobs System, Burst Compiler |
| 11 | AI & Gameplay Systems | NavMesh, FSMs, behavior trees, procedural gen |
| 12 | Multiplayer & Networking | Netcode, RPCs, latency, prediction |
| 13 | Tools & Editor Scripting | Custom editors, debug tools, CI/CD |
| 14 | Architecture & Clean Code | Service locators, DI, ScriptableObject architecture |
| 15 | Performance Optimization | CPU/GPU profiling, memory, object pooling |
| 16 | Production & Industry Practices | Git, Agile, asset pipelines, debugging at scale |
The rendering pipeline is the backbone of everything visual in your game. It determines how light interacts with surfaces, how shadows are cast, how post-processing effects are applied, and ultimately how many frames per second your game achieves. Unity's rendering architecture has undergone a revolutionary transformation with the introduction of the Scriptable Render Pipeline (SRP), giving developers unprecedented control over the rendering process.
Whether you're building a stylized mobile game that needs to run at 60fps on a $200 phone, or a photorealistic architectural visualization that leverages ray tracing on an RTX 4090, understanding rendering pipelines is the key to achieving your visual and performance targets.
Key Insight: Choosing the right render pipeline at project start is critical — migrating from one pipeline to another mid-project requires converting every material, shader, and post-processing effect. This can take weeks for a large project. Make this decision during pre-production, not mid-development.
1. Scriptable Render Pipeline (SRP)
1.1 Built-in vs SRP
Before 2018, Unity had a single, monolithic rendering pipeline — the Built-in Render Pipeline. It was a "one size fits all" solution that tried to serve every use case but excelled at none. The SRP revolution changed this:
| Aspect | Built-in Pipeline | Scriptable Render Pipeline |
|---|---|---|
| Architecture | Monolithic C++ code, black box | C# scriptable, open source, customizable |
| Rendering Paths | Forward and Deferred (fixed implementations) | Any path — Forward, Forward+, Deferred, custom |
| Shader Model | Legacy Surface Shaders, fixed function support | Modern shader architecture, Shader Graph |
| Batching | Static, Dynamic batching | SRP Batcher (dramatically faster), GPU instancing |
| Customization | Limited — command buffers, OnRenderImage | Full control — write custom render passes, features |
| Status | Maintenance mode — no new features | Active development — all new features here |
1.2 URP vs HDRP Decision Matrix
| Criteria | URP | HDRP |
|---|---|---|
| Target Hardware | Mobile, VR, Switch, low-mid PCs, WebGL | High-end PCs, PS5/Xbox Series X, workstations |
| Visual Fidelity | Good — stylized to semi-realistic | Outstanding — photorealistic, film quality |
| Performance Budget | Tight — every ms counts | Generous — can afford expensive effects |
| Ray Tracing | Not supported | Full support — RT reflections, GI, shadows, AO |
| Volumetric Effects | Basic fog | Volumetric fog, clouds, light shafts |
| 2D Support | Excellent — dedicated 2D Renderer | Not designed for 2D |
| Learning Curve | Moderate — good documentation, widespread use | Steep — complex settings, specialized knowledge |
| Ideal Projects | Mobile games, indie titles, VR, 2D games | AAA games, arch-viz, automotive, film |
Decision Rule: If you're unsure, choose URP. It covers 90%+ of game projects, runs on all platforms, and has the largest community and asset store support. Only choose HDRP if you specifically need photorealistic fidelity and your target hardware can handle it. Starting with HDRP for a mobile game is a costly mistake.
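If you ever need to confirm which pipeline a project is actually running (for example, in shared tooling or asset code), the active pipeline asset can be queried from `GraphicsSettings`. A minimal sketch; the helper class name `PipelineProbe` is illustrative, not a Unity API:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Hypothetical helper: reports which render pipeline is currently active.
public static class PipelineProbe
{
    public static string DescribeActivePipeline()
    {
        var pipeline = GraphicsSettings.currentRenderPipeline;
        if (pipeline == null) return "Built-in Render Pipeline"; // no SRP asset assigned
        if (pipeline is UniversalRenderPipelineAsset) return "URP";
        // HDRP's asset type lives in UnityEngine.Rendering.HighDefinition
        return pipeline.GetType().Name; // e.g. "HDRenderPipelineAsset"
    }
}
```

This is handy in editor tooling that must refuse to run under the wrong pipeline, since a URP-only renderer feature will silently do nothing under Built-in.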
2. Universal Render Pipeline (URP)
2.1 URP Setup & Renderer Features
URP is Unity's go-to pipeline for the vast majority of projects. It provides excellent visual quality while maintaining the performance headroom needed for mobile, VR, and cross-platform titles.
```csharp
// Custom URP Renderer Feature — add a full-screen effect
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class GrayscaleRendererFeature : ScriptableRendererFeature
{
    [System.Serializable]
    public class Settings
    {
        public RenderPassEvent renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;
        public Material grayscaleMaterial;
        [Range(0f, 1f)] public float intensity = 1f;
    }

    public Settings settings = new Settings();
    private GrayscaleRenderPass renderPass;

    public override void Create()
    {
        renderPass = new GrayscaleRenderPass(settings);
    }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        if (settings.grayscaleMaterial == null) return;
        renderer.EnqueuePass(renderPass);
    }

    class GrayscaleRenderPass : ScriptableRenderPass
    {
        private Settings settings;
        private RTHandle tempTexture;

        public GrayscaleRenderPass(Settings settings)
        {
            this.settings = settings;
            this.renderPassEvent = settings.renderPassEvent;
        }

        public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
        {
            // Allocate (or resize) the temporary texture to match the camera target.
            // Without this, tempTexture would be null when Execute runs.
            var descriptor = renderingData.cameraData.cameraTargetDescriptor;
            descriptor.depthBufferBits = 0;
            RenderingUtils.ReAllocateIfNeeded(ref tempTexture, descriptor, name: "_GrayscaleTemp");
        }

        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            CommandBuffer cmd = CommandBufferPool.Get("GrayscaleEffect");
            var cameraColorTarget = renderingData.cameraData.renderer.cameraColorTargetHandle;
            settings.grayscaleMaterial.SetFloat("_Intensity", settings.intensity);

            // Blit: copy the screen through the grayscale shader, then back
            Blit(cmd, cameraColorTarget, tempTexture, settings.grayscaleMaterial);
            Blit(cmd, tempTexture, cameraColorTarget);

            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
            // (In production, release tempTexture in the feature's Dispose().)
        }
    }
}
```
2.2 Forward & Forward+ Rendering
URP supports multiple rendering paths that determine how lights interact with objects:
| Rendering Path | How It Works | Light Limit | Best For |
|---|---|---|---|
| Forward | Lights evaluated per object in a single pass, up to a per-object limit | 1 main + 8 additional per object (configurable) | Mobile, VR, simple lighting scenarios |
| Forward+ | Tile-based light culling — lights assigned to screen tiles | Hundreds of real-time lights | Indoor scenes with many lights, RPGs, horror games |
| Deferred (URP) | G-Buffer pass stores surface data, then lights applied | Unlimited real-time lights | Complex indoor environments, many dynamic lights |
Pro Tip: Forward+ is the sweet spot for most URP projects targeting PC. It handles hundreds of lights with minimal overhead compared to traditional forward rendering, without the G-Buffer memory cost of deferred. Enable it in your URP Renderer Asset under Rendering Path.
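The rendering path itself is chosen on the URP Renderer Asset in the editor, but related limits live on the URP pipeline asset and can be adjusted from code, which is useful for quality presets. A sketch under the assumption of a URP project; the preset values here are illustrative:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public static class UrpQualityTuning
{
    public static void ApplyLowEndPreset()
    {
        // The asset assigned in Graphics Settings (null under Built-in; different type under HDRP)
        if (GraphicsSettings.currentRenderPipeline is not UniversalRenderPipelineAsset urp) return;

        urp.maxAdditionalLightsCount = 4; // fewer per-object lights on weak GPUs
        urp.shadowDistance = 30f;         // shorter shadow draw distance
        urp.renderScale = 0.8f;           // render below native resolution, upscale to screen
    }
}
```

Changing these at runtime affects every camera using that asset, so many projects instead ship several pre-authored URP assets and swap between them per quality level.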
3. High Definition Render Pipeline (HDRP)
3.1 Physically-Based Lighting & Ray Tracing
HDRP is designed for projects where visual fidelity is paramount. It uses physically accurate lighting models, HDR color space, and supports hardware-accelerated ray tracing for photorealistic rendering:
| HDRP Feature | Description | Performance Cost |
|---|---|---|
| Physical Light Units | Lights measured in lux, lumens, candela — real-world units | None (same rendering cost, better authoring) |
| Volumetric Fog | Light scattering through atmospheric density | Medium — raymarching in 3D volume texture |
| Volumetric Clouds | Physically-based cloud rendering with density maps | High — GPU-intensive raymarching |
| RT Reflections | Accurate reflections using ray tracing hardware | High — requires RTX GPU, 2-4ms per frame |
| RT Global Illumination | Dynamic indirect lighting computed via ray tracing | Very High — 3-6ms, but eliminates baking |
| RT Ambient Occlusion | Accurate contact shadows in crevices | Medium — 1-2ms, great quality improvement |
| Subsurface Scattering | Light passing through translucent materials (skin, wax, leaves) | Medium — additional render passes for SSS materials |
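Physical light units can also be set from script through HDRP's per-light component. A hedged sketch, assuming an HDRP project; exact method availability can vary slightly across HDRP versions:

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Attach to a GameObject that has a Light + HDAdditionalLightData component.
public class PhysicalLightSetup : MonoBehaviour
{
    void Start()
    {
        var hdLight = GetComponent<HDAdditionalLightData>();

        // Real-world reference points: a household bulb is roughly 800 lumens,
        // while direct midday sun is on the order of 100,000 lux.
        hdLight.SetIntensity(800f, LightUnit.Lumen);
    }
}
```

Authoring in physical units is what lets HDRP scenes mix real photographic exposure values with light intensities and still look plausible.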
3.2 Beyond Gaming: Film, Automotive & Arch-Viz
HDRP has found significant adoption outside traditional game development:
Case Study: "Enemies" — Unity's HDRP Tech Demo
Unity's "Enemies" real-time cinematic demo showcased HDRP's ability to render photorealistic human characters in real-time. Key technologies demonstrated:
- Digital Human package — skin rendering with 4-layer subsurface scattering (epidermis, dermis, subcutaneous, deep tissue)
- Strand-based hair — individual hair strands rendered and lit (not texture cards), with physics simulation
- Eye rendering — physically accurate eye model with caustics, iris detail, and refraction through cornea
- Ray-traced reflections and GI — accurate environmental reflections on skin and eyes, dynamic indirect lighting
The demo ran in real-time at 30fps on an RTX 3080, demonstrating that game engines can now compete with offline renderers for digital human work.
Tags: Photorealistic, Digital Humans, Ray Tracing, Real-Time
Case Study: "Book of the Dead" — Environment Showcase
Unity's "Book of the Dead" demo pushed the boundaries of real-time environment rendering. Using HDRP, the team achieved near-photorealistic forest environments with:
- Photogrammetry-scanned assets (trees, rocks, ground) with 8K textures
- Volumetric lighting through forest canopy with time-of-day transitions
- Wind simulation affecting vegetation with physically-based deformation
- Dynamic weather systems with real-time puddle accumulation and wet surface shaders
Tags: Photogrammetry, Environment Art, Volumetric Lighting
4. Shader Graph
4.1 Visual Shader Editor
Shader Graph is Unity's node-based visual shader editor. It allows you to create complex shader effects by connecting nodes in a graph, without writing a single line of HLSL code. Think of it as visual programming for GPU effects.
| Node Category | Key Nodes | Use Cases |
|---|---|---|
| Math | Add, Multiply, Lerp, Remap, Step, Smoothstep, Power | Blend values, create gradients, threshold effects |
| UV | Tiling and Offset, Rotate, Polar Coordinates, Twirl, Spherize | Texture animation, distortion effects, UV manipulation |
| Texture | Sample Texture 2D, Triplanar, Normal From Texture, Cubemap | Reading textures, triplanar mapping, normal maps |
| Input | Time, Position, View Direction, Normal Vector, Screen Position | Animation, view-dependent effects, screen-space effects |
| Procedural | Noise (Gradient, Simple, Voronoi), Checkerboard, Shapes | Procedural textures, organic patterns, masks |
| Custom Function | Custom Function node — embed HLSL code in a graph | Complex math not available as nodes, existing HLSL integration |
4.2 Toon, Dissolve & Hologram Shader Recipes
Here are the key concepts behind popular Shader Graph effects. While Shader Graph is visual, understanding the node logic helps you design custom shaders:
```hlsl
// Dissolve shader logic (conceptual — represents Shader Graph node flow)
// In Shader Graph, these would be connected as nodes:

// 1. Sample a noise texture (Gradient Noise or imported texture)
//      Input: UV coordinates
//      Output: float noise value (0-1)

// 2. Compare noise with a dissolve threshold (exposed property 0-1)
//      Step(threshold, noiseValue) — outputs 0 or 1
//      This creates the dissolve mask

// 3. Create an edge glow at the dissolve boundary
//      Smoothstep(threshold - edgeWidth, threshold, noiseValue)
//      Subtract the hard Step result to isolate the edge band
//      Multiply by emission color (hot orange/white)

// 4. Apply to material:
//      Alpha      = Step result (clips dissolved pixels)
//      Emission   = edge glow (makes the dissolve edge glow)
//      Base Color = original texture * Step result

// The equivalent HLSL for a Custom Function node:
void DissolveEffect_float(
    float NoiseValue,
    float Threshold,
    float EdgeWidth,
    float3 EdgeColor,
    out float Alpha,
    out float3 Emission)
{
    Alpha = step(Threshold, NoiseValue);
    float edge = smoothstep(Threshold - EdgeWidth, Threshold, NoiseValue);
    Emission = EdgeColor * (edge - Alpha) * 3.0; // Boost emission
}
```
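Because the dissolve mask is pure math, it can be sanity-checked on the CPU before wiring up the graph. Below is a plain C# mirror of the node logic, with `Step`/`Smoothstep` reimplemented to match HLSL semantics (the class name `DissolveMath` is just for this sketch):

```csharp
using System;

public static class DissolveMath
{
    // HLSL step(): 1 when x has reached the edge, else 0
    static float Step(float edge, float x) => x >= edge ? 1f : 0f;

    // HLSL smoothstep(): hermite ramp from a to b
    static float Smoothstep(float a, float b, float x)
    {
        float t = Math.Clamp((x - a) / (b - a), 0f, 1f);
        return t * t * (3f - 2f * t);
    }

    // Returns (alpha, edgeGlow) for one pixel's noise sample
    public static (float alpha, float edge) Evaluate(float noise, float threshold, float edgeWidth)
    {
        float alpha = Step(threshold, noise);
        float band = Smoothstep(threshold - edgeWidth, threshold, noise);
        return (alpha, band - alpha); // edge band = soft ramp minus the hard mask
    }
}
```

Kept pixels (noise above threshold) get alpha 1 and zero glow; pixels just below the threshold inside the edge width get alpha 0 and a nonzero glow band, which is exactly the hot rim the Emission output produces.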
```hlsl
// Hologram shader logic
// Combines multiple effects for a sci-fi holographic look:

// 1. Fresnel Effect — bright edges based on view angle
//      FresnelEffect(Normal, ViewDir, Power)
//      Makes object edges glow brighter (rim light effect)

// 2. Scanlines — horizontal lines scrolling over the surface
//      sin(ScreenPosition.y * frequency + Time * speed)
//      Step(0.5, sinResult) creates sharp scanline bands

// 3. Vertex displacement — subtle jitter
//      Offset vertices along normal by noise * small amount
//      Creates holographic "glitch" instability

// 4. Transparency — object is semi-transparent
//      Alpha = Fresnel * BaseAlpha + ScanlineAlpha
//      Surface type: Transparent, Blend mode: Additive

// 5. Color — single hologram tint (cyan, blue, or green)
//      Multiply all outputs by hologram color
```
Sub-Graphs for Reuse: In Shader Graph, you can create Sub Graphs — reusable shader functions that act like custom nodes. If you find yourself recreating the same node setup (e.g., a dissolve effect, a triplanar blend, a fresnel rim), extract it into a Sub Graph. This is the shader equivalent of a code function — DRY principle applied to visual programming.
5. HLSL Shader Programming
5.1 Vertex & Fragment Shader Structure
While Shader Graph handles most needs, understanding HLSL (High Level Shading Language) is essential for advanced effects, debugging shader issues, and writing custom lighting models. Every shader runs in two stages:
| Stage | Runs For | Input | Output |
|---|---|---|---|
| Vertex Shader | Each vertex in the mesh | Object-space position, normal, UV, color | Clip-space position (later mapped to screen coordinates), plus data to interpolate |
| Fragment Shader | Each pixel covered by the mesh | Interpolated data from vertex shader | Final pixel color (RGBA) |
```hlsl
// Complete URP-compatible HLSL shader
Shader "Custom/ToonShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _BaseColor ("Base Color", Color) = (1, 1, 1, 1)
        _ShadowColor ("Shadow Color", Color) = (0.3, 0.3, 0.5, 1)
        _ShadowThreshold ("Shadow Threshold", Range(0, 1)) = 0.5
        _ShadowSmoothness ("Shadow Smoothness", Range(0, 0.5)) = 0.02
        _RimColor ("Rim Color", Color) = (1, 1, 1, 1)
        _RimPower ("Rim Power", Range(1, 10)) = 4
        _RimThreshold ("Rim Threshold", Range(0, 1)) = 0.5
    }

    SubShader
    {
        Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" }

        Pass
        {
            Name "ForwardLit"
            Tags { "LightMode"="UniversalForward" }

            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma multi_compile _ _MAIN_LIGHT_SHADOWS
            #pragma multi_compile _ _MAIN_LIGHT_SHADOWS_CASCADE

            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"

            // Vertex input structure
            struct Attributes
            {
                float4 positionOS : POSITION;  // Object-space position
                float3 normalOS   : NORMAL;    // Object-space normal
                float2 uv         : TEXCOORD0; // UV coordinates
            };

            // Vertex-to-fragment interpolated data
            struct Varyings
            {
                float4 positionCS : SV_POSITION; // Clip-space position
                float2 uv         : TEXCOORD0;
                float3 normalWS   : TEXCOORD1;   // World-space normal
                float3 positionWS : TEXCOORD2;   // World-space position
                float3 viewDirWS  : TEXCOORD3;   // View direction
            };

            // Shader properties
            TEXTURE2D(_MainTex);
            SAMPLER(sampler_MainTex);

            CBUFFER_START(UnityPerMaterial)
                float4 _MainTex_ST;
                float4 _BaseColor;
                float4 _ShadowColor;
                float _ShadowThreshold;
                float _ShadowSmoothness;
                float4 _RimColor;
                float _RimPower;
                float _RimThreshold;
            CBUFFER_END

            // Vertex shader
            Varyings vert(Attributes input)
            {
                Varyings output;
                output.positionCS = TransformObjectToHClip(input.positionOS.xyz);
                output.positionWS = TransformObjectToWorld(input.positionOS.xyz);
                output.normalWS = TransformObjectToWorldNormal(input.normalOS);
                output.uv = TRANSFORM_TEX(input.uv, _MainTex);
                output.viewDirWS = GetWorldSpaceNormalizeViewDir(output.positionWS);
                return output;
            }

            // Fragment shader — toon lighting
            half4 frag(Varyings input) : SV_Target
            {
                // Sample base texture
                half4 texColor = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv);

                // Get main directional light
                Light mainLight = GetMainLight();
                float3 lightDir = mainLight.direction;

                // Toon shading — narrow smoothstep gives a hard (but antialiased) shadow edge
                float NdotL = dot(normalize(input.normalWS), lightDir);
                float shadow = smoothstep(
                    _ShadowThreshold - _ShadowSmoothness,
                    _ShadowThreshold + _ShadowSmoothness,
                    NdotL);

                // Blend between shadow and lit color
                float3 lighting = lerp(_ShadowColor.rgb, float3(1, 1, 1), shadow);

                // Rim lighting (Fresnel)
                float rimDot = 1.0 - dot(normalize(input.viewDirWS), normalize(input.normalWS));
                float rim = smoothstep(_RimThreshold, 1.0, pow(rimDot, _RimPower));

                // Final color
                float3 finalColor = texColor.rgb * _BaseColor.rgb * lighting * mainLight.color;
                finalColor += rim * _RimColor.rgb;
                return half4(finalColor, 1.0);
            }
            ENDHLSL
        }
    }
}
```
5.2 Custom Lighting Models
When to Write HLSL vs Use Shader Graph: Use Shader Graph for 90% of material effects — it's faster to iterate, easier to maintain, and less error-prone. Write HLSL when you need: custom lighting models (toon, NPR), performance-critical shaders with precise control, compute shaders, or effects that require techniques not exposed as Shader Graph nodes.
```hlsl
// Custom lighting function for Shader Graph (Custom Function node)
// Save as a .hlsl file and reference it from a Custom Function node

// Cel/Toon lighting with configurable bands
void CelShading_float(
    float3 Normal,
    float3 LightDirection,
    float3 LightColor,
    float Bands,      // Number of shading bands (2 = light/dark, 3 = light/mid/dark)
    float Smoothness, // Edge softness between bands
    out float3 Output)
{
    float NdotL = dot(Normal, LightDirection);
    float lightIntensity = (NdotL + 1.0) * 0.5; // Remap -1..1 to 0..1

    // Quantize to discrete bands
    lightIntensity = floor(lightIntensity * Bands) / Bands;

    // Optional: smooth the band transitions
    // lightIntensity = smoothstep(0, Smoothness, frac(lightIntensity * Bands));

    Output = LightColor * lightIntensity;
}
```
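The band quantization above is easy to verify numerically: with `Bands = 3`, every remapped intensity collapses onto one of a handful of discrete levels (0, 1/3, 2/3, or 1 at the fully-lit extreme). A CPU-side mirror of the same formula, useful for checking band behaviour before moving it to the GPU:

```csharp
using System;

public static class CelBands
{
    // Mirrors the HLSL: remap NdotL from [-1, 1] to [0, 1], then quantize into bands
    public static float Quantize(float nDotL, float bands)
    {
        float intensity = (nDotL + 1f) * 0.5f;
        return (float)Math.Floor(intensity * bands) / bands;
    }
}
```

For example, a surface facing the light head-on (`NdotL = 1`) stays at full intensity, while a surface perpendicular to the light (`NdotL = 0`) lands on the 1/3 band when `bands = 3`.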
6. Lighting Systems
6.1 Light Types & Modes
| Light Type | Shape | Use Cases | Performance |
|---|---|---|---|
| Directional | Infinite parallel rays (like the sun) | Sun/moon, primary scene illumination | Low — affects everything equally, one shadow cascade |
| Point | Sphere emitting in all directions | Torches, lamps, explosions, magic effects | Medium — shadow maps per face (6 faces = cube map) |
| Spot | Cone of light from a point | Flashlights, streetlights, stage lighting, headlights | Low-Medium — single shadow map, cone-based culling |
| Area (Baked only / HDRP real-time) | Rectangle or disc emitting from surface | Windows, screens, soft studio lighting, neon signs | High if real-time — expensive soft shadows. Free if baked |
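All four types share the same `Light` component; the tradeoffs in the table follow from how each is configured. A small sketch creating a spot light from code (the values are illustrative):

```csharp
using UnityEngine;

public class FlashlightSpawner : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("Flashlight");
        go.transform.SetParent(transform, false); // attach to this object

        var spot = go.AddComponent<Light>();
        spot.type = LightType.Spot;        // cone of light from a point
        spot.spotAngle = 45f;              // cone width in degrees
        spot.range = 20f;                  // falloff distance
        spot.intensity = 2f;
        spot.shadows = LightShadows.Soft;  // single shadow map, per the table above
    }
}
```

Spawning lights this way is common for gameplay-driven lighting (muzzle flashes, pickups); just remember every real-time shadowed light adds a shadow-map pass.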
Lighting modes determine when lighting is calculated:
| Mode | When Calculated | Best For | Limitations |
|---|---|---|---|
| Baked | Pre-computed in editor, stored in lightmaps | Static environments — zero runtime cost | Cannot change at runtime. No dynamic shadows on static objects |
| Real-time | Every frame on the GPU | Dynamic objects, moving lights, gameplay-driven lighting | Expensive. No indirect lighting (bounced light) without probes |
| Mixed | Indirect lighting baked, direct + shadows real-time | Best of both worlds — baked GI + real-time shadows | Complex setup. Multiple shadow modes (Shadowmask, Subtractive) |
6.2 Global Illumination, Light Probes & Reflection Probes
Global Illumination (GI) simulates how light bounces between surfaces. Without GI, a room lit by a window would have harsh light on one side and pure black everywhere else. GI adds realistic indirect illumination — the red wall reflecting red light onto the white ceiling, for example.
```csharp
// Configuring lightmapping via script (Editor-only — place in an Editor folder)
// Note: the older LightmapEditorSettings API is obsolete since Unity 2020.1;
// the LightingSettings asset used below is the modern equivalent.
using UnityEditor;
using UnityEngine;

public class LightmapConfigurator
{
    [MenuItem("Lighting/Setup Production Lightmaps")]
    public static void ConfigureProductionLightmaps()
    {
        var lightingSettings = new LightingSettings
        {
            lightmapper = LightingSettings.Lightmapper.ProgressiveGPU,

            // Resolution: texels per unit (higher = better quality, longer bake)
            lightmapResolution = 40, // Production quality
            lightmapPadding = 2,

            // Bounces: how many times light bounces between surfaces
            maxBounces = 4 // 2-3 for most scenes, 4+ for interiors
        };
        Lightmapping.lightingSettings = lightingSettings;

        // Environment lighting
        RenderSettings.ambientMode = UnityEngine.Rendering.AmbientMode.Skybox;
        RenderSettings.ambientIntensity = 1.0f;

        // Light Probes: capture indirect lighting for dynamic objects.
        // Place Light Probe Groups where dynamic objects move, with denser
        // probes near light transitions (doorways, shadows).
        Debug.Log("Lightmap settings configured for production bake.");
    }
}

// Runtime: Light Probe usage for dynamic objects
public class DynamicObjectLighting : MonoBehaviour
{
    // Light Probes automatically affect objects with:
    //   MeshRenderer.lightProbeUsage = LightProbeUsage.BlendProbes
    // This interpolates between nearby probes for smooth indirect lighting
    // on dynamic (non-static) objects.

    // Reflection Probes: capture reflections for PBR materials
    // - Place at eye height in each distinct area
    // - Set to Baked for static environments
    // - Set to Realtime (expensive!) only for dynamic reflections
    // - Use Box Projection for indoor probes (parallax-correct reflections)
}
```
Common Mistake: Forgetting Light Probes for dynamic objects. Baked lighting only affects static objects via lightmaps. If your player character or enemies walk through a baked scene without Light Probes, they'll appear uniformly lit (flat) while the environment looks beautifully lit around them. Always place a Light Probe Group with probes at key positions — especially near doors, under trees, and at lighting transitions.
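The probe blending described above is controlled per renderer, and the blended result can be sampled directly, which is a quick way to debug a "flat" character. A sketch using standard Unity APIs; the class name `ProbeDebug` is illustrative:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class ProbeDebug : MonoBehaviour
{
    void Start()
    {
        // Opt this renderer into Light Probe interpolation (the default,
        // but worth asserting when characters look uniformly lit)
        var meshRenderer = GetComponent<MeshRenderer>();
        meshRenderer.lightProbeUsage = LightProbeUsage.BlendProbes;
    }

    void Update()
    {
        // Sample the interpolated probe at this position to inspect
        // the spherical-harmonics lighting a dynamic object receives here.
        LightProbes.GetInterpolatedProbe(transform.position, null, out SphericalHarmonicsL2 probe);
        // If 'probe' is near-black while the baked environment is bright,
        // the probe group is missing or too sparse at this location.
    }
}
```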
7. History: From Fixed Function to SRP
| Era | Rendering Approach | Key Milestones |
|---|---|---|
| Pre-2000 | Fixed Function Pipeline | Hardwired GPU operations. No shaders. Colors and lighting computed by fixed hardware stages. T&L (Transform and Lighting) units |
| 2001-2006 | Programmable Shaders (SM 1.0-3.0) | Vertex and pixel shaders in assembly, then HLSL/GLSL. Per-pixel lighting, normal mapping, shadow maps become possible |
| 2007-2013 | Modern Shader Model (SM 4.0-5.0) | Geometry shaders, tessellation, compute shaders. Deferred rendering becomes mainstream. PBR (Physically Based Rendering) emerges |
| 2014-2017 | Unity Built-in Pipeline maturity | Unity 5 introduces PBR (Standard Shader), real-time GI, HDR, linear color space. Single pipeline serves all platforms |
| 2018-2020 | SRP revolution | LWRP (renamed URP), HDRP introduced. Shader Graph. SRP Batcher for 2-4x CPU rendering speedup. Asset store transition period |
| 2021-2026 | SRP maturity, ray tracing era | URP Forward+, HDRP ray tracing production-ready, GPU-driven rendering, APV (Adaptive Probe Volumes) replacing Light Probes, Unity 6 rendering enhancements |
Exercises & Self-Assessment
Exercise 1: Pipeline Comparison Lab
Create the same scene in both URP and HDRP to understand the differences:
- Create a URP project with a simple indoor scene (room with furniture, a window, and a lamp)
- Create the same scene in an HDRP project
- Compare: visual quality, frame time (ms), memory usage, and build size
- Enable volumetric fog in HDRP — what visual improvement does it add?
- Document your findings in a comparison table
Exercise 2: Shader Graph Workshop
Create three custom shaders using Shader Graph:
- Dissolve shader — noise-based dissolve with glowing edges. Expose threshold and edge color as properties
- Toon/Cel shader — 3-band cel shading with configurable shadow color and rim lighting
- Hologram shader — fresnel glow, scanlines, vertex displacement, transparency. Animate with Time node
- Extract the dissolve effect into a Sub Graph so it can be reused in other shaders
- Apply each shader to a 3D model and take screenshots comparing them to the default Lit shader
Exercise 3: Lighting Masterclass
Master Unity's lighting system through systematic experimentation:
- Create an outdoor scene with only a Directional Light (Real-time) — note: no indirect lighting
- Switch the Directional Light to Mixed mode and bake lightmaps — observe the indirect light bouncing
- Add Light Probes and place a dynamic object — compare its lighting with and without probes
- Add Reflection Probes and create a shiny metallic surface — compare reflections with Box Projection on and off
- Profile the scene: what percentage of frame time is spent on lighting?
Exercise 4: Reflective Questions
- Why did Unity create two separate SRP pipelines (URP and HDRP) instead of a single configurable pipeline? What are the architectural tradeoffs?
- Explain why Shader Graph cannot fully replace hand-written HLSL. What specific capabilities require custom code?
- A scene has 50 point lights in a dungeon. Compare the performance implications of using Forward, Forward+, and Deferred rendering paths in URP.
- Your baked lightmaps look great but dynamic characters appear flat and disconnected from the environment. What three systems would you implement to fix this?
- You need to support both mobile (60fps on iPhone 12) and PC (4K, 60fps). How would you structure your shader and lighting approach to serve both from one URP project?
Conclusion & Next Steps
You now understand Unity's rendering architecture from pipeline selection to shader authoring to professional lighting. Here are the key takeaways from Part 9:
- Choose URP for 90%+ of projects — it covers mobile, VR, PC, and consoles with excellent performance. Only choose HDRP for photorealistic or high-end PC/console projects
- SRP Batcher is the single most impactful rendering optimization — always enable it in URP/HDRP projects
- Shader Graph handles most shader needs visually. Use Sub Graphs for reusable effects. Drop into HLSL only when Shader Graph cannot express what you need
- HLSL knowledge is essential for understanding what happens under the hood, debugging shader issues, and writing custom lighting models
- Lighting modes (Baked, Real-time, Mixed) are not mutually exclusive — most production scenes use Mixed lighting with baked GI and real-time shadows
- Light Probes and Reflection Probes bridge the gap between baked environments and dynamic objects — never skip them
- Profile GPU time — rendering is typically the bottleneck. Use the Frame Debugger and GPU Profiler to identify expensive passes
Next in the Series
In Part 10: Data-Oriented Tech Stack (DOTS), we explore Unity's paradigm shift from object-oriented MonoBehaviour to data-oriented ECS. Learn the Entity Component System, the C# Jobs System for multithreading, and the Burst Compiler for native-speed performance — enabling you to handle 100,000+ entities at 60fps.
Continue the Series

- **Part 10: Data-Oriented Tech Stack (DOTS).** Unlock massive performance with ECS, the Jobs System, and the Burst Compiler for handling 100K+ entities.
- **Part 11: AI & Gameplay Systems.** NavMesh navigation, finite state machines, behavior trees, and procedural generation techniques.
- **Part 8: Building & Publishing.** Build pipeline, platform optimization, store submission, and monetization strategies.