Introduction: The Soul of Your Game
Series Overview: This is Part 6 of our 16-part Unity Game Engine Series. We dive deep into animation — the system that transforms static meshes into living, breathing characters and worlds that respond to player input with fluid, believable motion.
| Part | Topic | Covers |
| --- | --- | --- |
| 1 | Unity Basics & Interface | Editor overview, assets, prefabs, architecture |
| 2 | C# Scripting Fundamentals | MonoBehaviour, coroutines, input systems, patterns |
| 3 | GameObjects & Components | Transforms, renderers, custom components |
| 4 | Physics & Collisions | Rigidbody, colliders, raycasting, forces |
| 5 | UI Systems | Canvas, uGUI, UI Toolkit, responsive design |
| 6 | Animation & State Machines (You Are Here) | Animator, blend trees, IK, Timeline |
| 7 | Audio & Visual Effects | AudioSource, particles, VFX Graph, post-processing |
| 8 | Building & Publishing | Build pipeline, optimization, platforms, monetization |
| 9 | Rendering Pipelines | URP, HDRP, Shader Graph, lighting systems |
| 10 | Data-Oriented Tech Stack | ECS, Jobs System, Burst Compiler |
| 11 | AI & Gameplay Systems | NavMesh, FSMs, behavior trees, procedural gen |
| 12 | Multiplayer & Networking | Netcode, RPCs, latency, prediction |
| 13 | Tools & Editor Scripting | Custom editors, debug tools, CI/CD |
| 14 | Architecture & Clean Code | Service locators, DI, ScriptableObject architecture |
| 15 | Performance Optimization | CPU/GPU profiling, memory, object pooling |
| 16 | Production & Industry Practices | Git, Agile, asset pipelines, debugging at scale |
Animation is what separates a tech demo from a game. It's the difference between a character that slides across the ground and one that runs, jumps, rolls, and reacts with weight and personality. Great animation communicates intent, feedback, and emotion without a single word.
Unity's animation system — known as Mecanim — is one of the most powerful and flexible animation frameworks available in any game engine. It combines visual state machine editing, blend trees for fluid transitions, Inverse Kinematics for procedural body adjustment, and Timeline for cinematic authoring — all accessible through both visual tools and C# scripting.
Key Insight: Animation is not just visual polish — it's game feel. The responsiveness of a character's attack animation, the weight of a landing, the anticipation before a jump — these tiny details are what players describe as a game feeling "tight" or "floaty." Mastering animation is mastering player experience.
1. Animation Basics
1.1 Animation Clips & Keyframes
An Animation Clip is a single piece of animation data — a walk cycle, an attack swing, a door opening. Each clip contains keyframes that define property values at specific points in time. Unity interpolates between keyframes to create smooth motion.
| Concept | Description | Example |
| --- | --- | --- |
| Keyframe | A snapshot of a property's value at a specific time | Position = (0, 2, 0) at frame 30 |
| Interpolation | How Unity calculates values between keyframes | Linear, ease-in/out, constant, custom curves |
| Animation Clip | A collection of keyframed property changes over time | "Walk" clip: hip bobbing, leg cycling, arm swinging |
| Looping | Whether the clip repeats seamlessly | Walk/run loops vs. one-shot attacks/deaths |
| Sample Rate | Keyframes per second (default: 60) | Higher = smoother but larger file size |
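The same keyframe concepts apply when a clip is built entirely from code. The sketch below is a minimal example, not a recommended production pattern: it marks the clip as legacy so it can be played through the simple `Animation` component, and the clip name and curve values are illustrative assumptions.

```csharp
using UnityEngine;

public class RuntimeClipExample : MonoBehaviour
{
    private void Start()
    {
        // Build a clip from code: keyframes on Transform.localPosition.y
        var clip = new AnimationClip { legacy = true }; // legacy = playable via the Animation component

        // Two keyframes: y = 0 at t = 0, y = 1 at t = 1, with ease-in/out interpolation
        var curve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);
        clip.SetCurve("", typeof(Transform), "localPosition.y", curve);
        clip.wrapMode = WrapMode.PingPong; // loop back and forth for a bobbing motion

        var anim = gameObject.AddComponent<Animation>();
        anim.AddClip(clip, "Bob");
        anim.Play("Bob");
    }
}
```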
1.2 Animation Window & Recording Mode
Unity's Animation Window is a timeline-based editor for creating and modifying animation clips directly within the editor:
Pro Tip: Use Recording Mode (the red record button) to create keyframes by simply moving objects in the Scene View. Unity automatically captures property changes as keyframes at the current playhead position. This is incredibly intuitive for quick prototyping — move the playhead to frame 30, drag your object to a new position, and a keyframe is created automatically.
1.3 Animation Curves
Animation curves control the easing between keyframes — how fast or slow a property changes over time. They are what make animation feel natural rather than robotic:
```csharp
using UnityEngine;

public class AnimationCurveExamples : MonoBehaviour
{
    // Expose curves in the Inspector for visual editing
    [Header("Movement Curves")]
    [SerializeField] private AnimationCurve jumpCurve;
    [SerializeField] private AnimationCurve bounceCurve;
    [SerializeField] private AnimationCurve shakeCurve;

    [Header("Settings")]
    [SerializeField] private float jumpHeight = 3f;
    [SerializeField] private float jumpDuration = 0.8f;

    private float jumpTimer = -1f;
    private Vector3 startPosition;

    public void StartJump()
    {
        jumpTimer = 0f;
        startPosition = transform.position;
    }

    private void Update()
    {
        if (jumpTimer < 0f) return;

        jumpTimer += Time.deltaTime;
        float normalizedTime = jumpTimer / jumpDuration;

        if (normalizedTime >= 1f)
        {
            // Jump complete
            transform.position = startPosition;
            jumpTimer = -1f;
            return;
        }

        // Evaluate the curve at the current normalized time (0-1).
        // The curve shape determines the jump arc:
        //   - Linear: unrealistic constant speed
        //   - Ease-out + ease-in: natural parabolic arc
        //   - Custom: bouncy, floaty, heavy, etc.
        float heightMultiplier = jumpCurve.Evaluate(normalizedTime);
        transform.position = startPosition + Vector3.up * (heightMultiplier * jumpHeight);
    }

    // Create curves programmatically
    private void CreateBounceCurve()
    {
        bounceCurve = new AnimationCurve();
        bounceCurve.AddKey(new Keyframe(0f, 0f, 0f, 8f));      // Start: fast rise
        bounceCurve.AddKey(new Keyframe(0.3f, 1f, 0f, 0f));    // Peak
        bounceCurve.AddKey(new Keyframe(0.5f, 0.2f, 0f, 0f));  // First bounce
        bounceCurve.AddKey(new Keyframe(0.65f, 0.5f, 0f, 0f)); // Second peak
        bounceCurve.AddKey(new Keyframe(0.8f, 0.05f, 0f, 0f)); // Second bounce
        bounceCurve.AddKey(new Keyframe(1f, 0f, -0.5f, 0f));   // Settle
    }
}
```
2. The Animator System
2.1 Animator Controller
The Animator Controller is a visual state machine that manages which animation clip plays and how clips transition between each other. Think of it as a flowchart for your character's animations — each box is a state (idle, walk, run, attack) and each arrow is a transition with conditions.
| Component | Description | Purpose |
| --- | --- | --- |
| State | A node in the graph linked to an animation clip or blend tree | Defines what animation plays |
| Transition | An arrow connecting two states with conditions | Defines when and how to switch animations |
| Parameter | A variable (bool, int, float, trigger) that drives transitions | Bridge between gameplay code and animation |
| Layer | Independent animation graphs that can blend (e.g., upper body + lower body) | Play different animations on different body parts simultaneously |
| Avatar Mask | Defines which bones a layer affects | Upper body aiming while lower body runs |
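Layers are driven from code through their weight. A minimal sketch, assuming an Animator layer named "UpperBody" that holds aiming overrides (the layer name and `SetAiming` hook are illustrative):

```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class UpperBodyLayerController : MonoBehaviour
{
    [SerializeField] private float fadeSpeed = 5f;

    private Animator animator;
    private int upperBodyLayer;
    private float targetWeight;

    private void Awake()
    {
        animator = GetComponent<Animator>();
        // Look the layer up once by name; index 0 is always the Base Layer
        upperBodyLayer = animator.GetLayerIndex("UpperBody");
    }

    // Called by gameplay code: true while the character is aiming
    public void SetAiming(bool isAiming)
    {
        targetWeight = isAiming ? 1f : 0f;
    }

    private void Update()
    {
        // Fade the layer weight instead of snapping it, so the override blends in
        float current = animator.GetLayerWeight(upperBodyLayer);
        float next = Mathf.MoveTowards(current, targetWeight, fadeSpeed * Time.deltaTime);
        animator.SetLayerWeight(upperBodyLayer, next);
    }
}
```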
2.2 Parameters & Transitions
```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class CharacterAnimatorController : MonoBehaviour
{
    private Animator animator;

    // Cache parameter hashes for performance (avoid string lookups every frame)
    private static readonly int SpeedHash = Animator.StringToHash("Speed");
    private static readonly int IsGroundedHash = Animator.StringToHash("IsGrounded");
    private static readonly int IsAttackingHash = Animator.StringToHash("IsAttacking");
    private static readonly int JumpTriggerHash = Animator.StringToHash("Jump");
    private static readonly int AttackTriggerHash = Animator.StringToHash("Attack");
    private static readonly int HurtTriggerHash = Animator.StringToHash("Hurt");
    private static readonly int DeathTriggerHash = Animator.StringToHash("Death");
    private static readonly int AttackIndexHash = Animator.StringToHash("AttackIndex");

    private void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Called from your character controller every frame
    public void UpdateAnimation(float moveSpeed, bool isGrounded)
    {
        // Float parameter: drives blend tree (idle/walk/run)
        animator.SetFloat(SpeedHash, moveSpeed, 0.1f, Time.deltaTime);

        // Bool parameter: controls ground/air state transitions
        animator.SetBool(IsGroundedHash, isGrounded);
    }

    public void TriggerJump()
    {
        // Trigger: fires once, auto-resets (perfect for one-shot actions)
        animator.SetTrigger(JumpTriggerHash);
    }

    public void TriggerAttack(int comboIndex)
    {
        // Int parameter: select which attack in a combo chain
        animator.SetInteger(AttackIndexHash, comboIndex);
        animator.SetTrigger(AttackTriggerHash);
    }

    public void TriggerHurt()
    {
        animator.SetTrigger(HurtTriggerHash);
    }

    public void TriggerDeath()
    {
        animator.SetTrigger(DeathTriggerHash);
        // Disable the animator to freeze on the death pose
        // animator.enabled = false; // or use a "Dead" state with no exit
    }

    // Query current animation state
    public bool IsInState(string stateName, int layer = 0)
    {
        return animator.GetCurrentAnimatorStateInfo(layer).IsName(stateName);
    }

    // Check if a transition is currently in progress
    public bool IsTransitioning(int layer = 0)
    {
        return animator.IsInTransition(layer);
    }
}
```
Performance Tip: Convert parameter names to integer hashes once with Animator.StringToHash(), then use the hash for every subsequent call. String-based overloads like SetFloat("Speed", value) re-hash the string on each call, which is measurably slower than SetFloat(SpeedHash, value). In a game with 100 animated characters updating parameters every frame, the savings add up.
Transition Settings Matter: Two critical transition properties that new developers overlook: Has Exit Time determines whether the current animation must finish before transitioning (good for committed attacks, bad for interrupts). Transition Duration controls how long the blend between states takes; too short looks like a visual pop, too long feels sluggish. For responsive gameplay, disable Has Exit Time on interrupt transitions and keep Transition Duration between 0.05 and 0.15 seconds.
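Transitions do not have to live in the Animator graph at all: you can blend to a state directly from script, which is handy for one-off interrupts. A brief sketch (the "Hurt" state name is an assumption), where the second argument of CrossFadeInFixedTime plays the same role as Transition Duration:

```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class HurtInterrupt : MonoBehaviour
{
    private Animator animator;

    private void Awake() => animator = GetComponent<Animator>();

    public void PlayHurt()
    {
        // Blend into the "Hurt" state over 0.1 s, like a transition with
        // Has Exit Time disabled and Transition Duration = 0.1
        animator.CrossFadeInFixedTime("Hurt", 0.1f);
    }
}
```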
3. Blend Trees
Blend Trees are the secret weapon for creating smooth, fluid character movement. Instead of abruptly switching between walk and run animations, a blend tree smoothly interpolates between multiple clips based on parameter values.
3.1 1D Blend Trees
A 1D blend tree blends animations along a single axis — perfect for speed-based movement:
| Parameter Value | Animation | Blend Result |
| --- | --- | --- |
| 0.0 | Idle | Standing still |
| 0.3 | Idle + Walk blend | Starting to move, slight shuffling |
| 0.5 | Walk | Normal walking pace |
| 0.75 | Walk + Run blend | Jogging speed |
| 1.0 | Run | Full sprint |
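Driving the table above from gameplay takes only one float. A sketch using the legacy input axes; the "Speed" parameter name and the 0.5 walk cap are assumptions chosen to match the table:

```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class LocomotionDriver : MonoBehaviour
{
    private static readonly int SpeedHash = Animator.StringToHash("Speed");
    private Animator animator;

    private void Awake() => animator = GetComponent<Animator>();

    private void Update()
    {
        // Input magnitude 0..1: analog sticks give the full range automatically
        Vector2 input = new Vector2(Input.GetAxis("Horizontal"), Input.GetAxis("Vertical"));
        float speed = Mathf.Clamp01(input.magnitude);

        // Cap at walking pace unless sprint is held (the tree maps 0.5 to Walk, 1.0 to Run)
        if (!Input.GetKey(KeyCode.LeftShift))
            speed = Mathf.Min(speed, 0.5f);

        // Damped update avoids pops when keyboard input changes instantly
        animator.SetFloat(SpeedHash, speed, 0.1f, Time.deltaTime);
    }
}
```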
3.2 2D Blend Trees
2D blend trees blend animations along two axes — essential for directional movement (forward/backward + left/right):
```csharp
using UnityEngine;

public class DirectionalMovementAnimator : MonoBehaviour
{
    private Animator animator;

    // 2D Blend Tree parameters
    private static readonly int MoveXHash = Animator.StringToHash("MoveX");
    private static readonly int MoveZHash = Animator.StringToHash("MoveZ");

    [SerializeField] private float dampTime = 0.1f;

    private void Awake()
    {
        animator = GetComponent<Animator>();
    }

    public void UpdateMovementAnimation(Vector3 localVelocity, float maxSpeed)
    {
        // Normalize velocity relative to max speed
        float normalizedX = localVelocity.x / maxSpeed; // Strafe left/right
        float normalizedZ = localVelocity.z / maxSpeed; // Forward/backward

        // Smoothly update blend tree parameters.
        // dampTime creates smooth transitions instead of instant snapping.
        animator.SetFloat(MoveXHash, normalizedX, dampTime, Time.deltaTime);
        animator.SetFloat(MoveZHash, normalizedZ, dampTime, Time.deltaTime);
    }
}

// Blend tree layout (configured in the Animator window):
// 2D Freeform Directional blend with positions:
//
//                  Forward (0, 1)
//                      Run_F
//                        |
// Strafe_L (-1, 0) -- Idle (0, 0) -- Strafe_R (1, 0)
//                        |
//                      Run_B
//                 Backward (0, -1)
//
// Plus diagonals: RunForwardLeft (-0.7, 0.7), etc.
```
3.3 Choosing the Right 2D Blend Type
| Type | Best For | How It Works |
| --- | --- | --- |
| 2D Simple Directional | Single motion per direction | One clip at each compass point. Fast but limited. |
| 2D Freeform Directional | Multiple speeds per direction | Multiple clips at different magnitudes in the same direction (walk + run forward). |
| 2D Freeform Cartesian | Independent axes (lean + speed) | Each axis represents a different concept. Best for non-directional blending. |
4. State Machines
4.1 Character State Design
A well-designed state machine is the backbone of character behavior. Each state represents a distinct mode of operation with its own animation, movement rules, and transition conditions:
| State | Animation | Transitions To | Condition |
| --- | --- | --- | --- |
| Idle | Idle loop (breathing, shifting weight) | Walk, Jump, Attack, Hurt, Death | Speed > 0.1, Jump trigger, Attack trigger, etc. |
| Walk/Run | Locomotion blend tree | Idle, Jump, Attack, Hurt, Death | Speed < 0.1, Jump trigger, etc. |
| Jump | Jump start, air loop, land | Idle, Walk (on land) | IsGrounded = true |
| Attack | Attack combo clips | Idle (after exit time) | Has Exit Time = true |
| Hurt | Flinch/stagger animation | Idle (after exit time) | Has Exit Time = true |
| Death | Death animation (no loop) | None (terminal state) | No transitions out |
```csharp
using UnityEngine;

// Code-side state machine that mirrors the Animator states,
// coordinating game logic with animation state
public class CharacterStateMachine : MonoBehaviour
{
    public enum CharacterState
    {
        Idle, Locomotion, Jumping, Attacking, Hurt, Dead
    }

    [SerializeField] private CharacterState currentState = CharacterState.Idle;

    private CharacterAnimatorController animController;

    public CharacterState CurrentState => currentState;

    private void Awake()
    {
        animController = GetComponent<CharacterAnimatorController>();
    }

    public bool TryTransition(CharacterState newState)
    {
        // Validate transition rules
        if (!IsTransitionValid(currentState, newState))
            return false;

        CharacterState previousState = currentState;
        currentState = newState;
        OnStateEnter(newState, previousState);
        return true;
    }

    private bool IsTransitionValid(CharacterState from, CharacterState to)
    {
        // Dead is a terminal state: no transitions out
        if (from == CharacterState.Dead) return false;

        // Hurt and Death can interrupt anything
        if (to == CharacterState.Hurt || to == CharacterState.Dead) return true;

        // Can't attack while jumping (in this design)
        if (from == CharacterState.Jumping && to == CharacterState.Attacking) return false;

        // Can't move while attacking (committed animations)
        if (from == CharacterState.Attacking && to == CharacterState.Locomotion) return false;

        return true;
    }

    private void OnStateEnter(CharacterState state, CharacterState previous)
    {
        switch (state)
        {
            case CharacterState.Jumping:
                animController.TriggerJump();
                break;
            case CharacterState.Attacking:
                animController.TriggerAttack(0);
                break;
            case CharacterState.Hurt:
                animController.TriggerHurt();
                break;
            case CharacterState.Dead:
                animController.TriggerDeath();
                break;
        }

        Debug.Log($"State: {previous} -> {state}");
    }
}
```
4.2 Sub-State Machines & Any State
Sub-state machines help organize complex Animator Controllers by grouping related states. Any State transitions allow certain events (like taking damage) to interrupt any current state:
Design Pattern: Use Sub-State Machines to organize states by category. Group all locomotion states (idle, walk, run, sprint, crouch) into a "Locomotion" sub-state machine. Group combat states (attack_1, attack_2, attack_3, block, parry) into a "Combat" sub-state machine. This keeps the top-level Animator graph clean and readable even for characters with 50+ animation states.
Any State Warning: Any State transitions are powerful but dangerous. An Any State transition to "Hurt" means any animation can be interrupted by taking damage — including the Hurt animation itself, creating an infinite loop. Always add conditions to prevent self-transitions: add a "CanBeHurt" bool parameter that you set to false during the Hurt state.
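One clean way to implement that guard is a StateMachineBehaviour attached to the Hurt state itself, so the flag can never drift out of sync with the state. A sketch, assuming a bool parameter named "CanBeHurt" as described above:

```csharp
using UnityEngine;

// Attach to the Hurt state in the Animator window (Add Behaviour button)
public class HurtStateGuard : StateMachineBehaviour
{
    private static readonly int CanBeHurtHash = Animator.StringToHash("CanBeHurt");

    public override void OnStateEnter(Animator animator, AnimatorStateInfo stateInfo, int layerIndex)
    {
        // Block re-entry while the flinch plays, preventing the infinite loop
        animator.SetBool(CanBeHurtHash, false);
    }

    public override void OnStateExit(Animator animator, AnimatorStateInfo stateInfo, int layerIndex)
    {
        // Vulnerable again once the Hurt animation ends
        animator.SetBool(CanBeHurtHash, true);
    }
}
```

The Any State transition to Hurt then gets the condition CanBeHurt = true, which rules out the self-transition.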
5. Advanced Animation
5.1 Inverse Kinematics (IK)
Inverse Kinematics procedurally adjusts bone positions to reach target points — making characters interact naturally with the environment. Instead of animating every possible hand position, IK calculates the bone chain needed to reach a target in real time.
```csharp
using UnityEngine;

[RequireComponent(typeof(Animator))]
public class CharacterIK : MonoBehaviour
{
    private Animator animator;

    [Header("Foot IK")]
    [SerializeField] private bool enableFootIK = true;
    [SerializeField] private float footOffset = 0.1f;
    [SerializeField] private float raycastDistance = 1.5f;
    [SerializeField] private LayerMask groundLayer;

    [Header("Look At IK")]
    [SerializeField] private bool enableLookAt = true;
    [SerializeField] private Transform lookTarget;
    [SerializeField] [Range(0, 1)] private float lookWeight = 0.7f;

    [Header("Hand IK")]
    [SerializeField] private bool enableHandIK = false;
    [SerializeField] private Transform rightHandTarget;
    [SerializeField] private Transform leftHandTarget;

    private void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Called by Unity's animation system when the IK pass is processed
    private void OnAnimatorIK(int layerIndex)
    {
        if (animator == null) return;

        // === LOOK AT IK ===
        if (enableLookAt && lookTarget != null)
        {
            animator.SetLookAtWeight(lookWeight, 0.3f, 0.6f, 0.8f, 0.5f);
            animator.SetLookAtPosition(lookTarget.position);
        }

        // === FOOT IK (ground conformity) ===
        if (enableFootIK)
        {
            AdjustFoot(AvatarIKGoal.LeftFoot);
            AdjustFoot(AvatarIKGoal.RightFoot);
        }

        // === HAND IK (grabbing objects, weapons) ===
        if (enableHandIK)
        {
            if (rightHandTarget != null)
            {
                animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
                animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1f);
                animator.SetIKPosition(AvatarIKGoal.RightHand, rightHandTarget.position);
                animator.SetIKRotation(AvatarIKGoal.RightHand, rightHandTarget.rotation);
            }
            if (leftHandTarget != null)
            {
                animator.SetIKPositionWeight(AvatarIKGoal.LeftHand, 1f);
                animator.SetIKRotationWeight(AvatarIKGoal.LeftHand, 1f);
                animator.SetIKPosition(AvatarIKGoal.LeftHand, leftHandTarget.position);
                animator.SetIKRotation(AvatarIKGoal.LeftHand, leftHandTarget.rotation);
            }
        }
    }

    private void AdjustFoot(AvatarIKGoal foot)
    {
        // Raycast down from the foot bone to find the ground
        Vector3 footPos = animator.GetIKPosition(foot);
        Ray ray = new Ray(footPos + Vector3.up, Vector3.down);

        if (Physics.Raycast(ray, out RaycastHit hit, raycastDistance, groundLayer))
        {
            // Place the foot on the ground surface
            Vector3 targetPos = hit.point + Vector3.up * footOffset;
            animator.SetIKPositionWeight(foot, 1f);
            animator.SetIKPosition(foot, targetPos);

            // Align foot rotation to the ground normal (slopes, stairs)
            Quaternion footRotation = Quaternion.LookRotation(
                Vector3.ProjectOnPlane(transform.forward, hit.normal), hit.normal
            );
            animator.SetIKRotationWeight(foot, 1f);
            animator.SetIKRotation(foot, footRotation);
        }
    }
}
```
5.2 Root Motion vs Script-Driven Movement
| Approach | How It Works | Best For | Drawbacks |
| --- | --- | --- | --- |
| Root Motion | Movement baked into the animation. The animation drives the character's position. | Realistic movement (Souls-like), cinematic quality, weight and momentum | Less responsive to input, requires an animation for every movement variation |
| Script-Driven | Code controls position directly via Transform or Rigidbody. Animation is cosmetic only. | Responsive, arcade-style gameplay (platformers, shooters), precise control | Potential foot sliding if animation speed doesn't match movement speed |
```csharp
using UnityEngine;

// Root Motion controller — animation drives movement
[RequireComponent(typeof(Animator), typeof(Rigidbody))]
public class RootMotionController : MonoBehaviour
{
    private Animator animator;
    private Rigidbody rb;

    [SerializeField] private bool useRootMotion = true;

    private void Awake()
    {
        animator = GetComponent<Animator>();
        rb = GetComponent<Rigidbody>();
        animator.applyRootMotion = useRootMotion;
    }

    // Called by Unity when root motion is calculated
    private void OnAnimatorMove()
    {
        if (useRootMotion)
        {
            // Apply the animation's root motion delta to the Rigidbody.
            // This gives physically correct, animation-driven movement.
            rb.MovePosition(rb.position + animator.deltaPosition);
            rb.MoveRotation(rb.rotation * animator.deltaRotation);
        }
    }
}
```
5.3 Timeline & Cinemachine
Timeline is Unity's visual sequencing tool for creating cutscenes, cinematics, and scripted gameplay sequences. Think of it as a non-linear video editor built into Unity:
```csharp
using UnityEngine;
using UnityEngine.Playables;

public class CutsceneController : MonoBehaviour
{
    [SerializeField] private PlayableDirector introCutscene;
    [SerializeField] private PlayableDirector bossCutscene;

    // Assigned in the Inspector: cheaper and safer than FindObjectOfType at runtime
    [SerializeField] private CharacterStateMachine playerStateMachine;

    private PlayableDirector activeDirector;

    public void PlayCutscene(PlayableDirector director)
    {
        if (activeDirector != null) return;
        activeDirector = director;

        // Disable player controls during the cutscene
        playerStateMachine.enabled = false;

        director.stopped += OnCutsceneComplete;
        director.Play();
    }

    private void OnCutsceneComplete(PlayableDirector director)
    {
        director.stopped -= OnCutsceneComplete;
        activeDirector = null;

        // Re-enable player controls
        playerStateMachine.enabled = true;
        Debug.Log("Cutscene complete — returning control to player");
    }

    // Skip-cutscene functionality
    private void Update()
    {
        if (activeDirector != null && Input.GetKeyDown(KeyCode.Escape))
        {
            // Jump to the end so final object states are applied, then stop
            activeDirector.time = activeDirector.duration;
            activeDirector.Evaluate();
            activeDirector.Stop();
        }
    }
}
```
5.4 Animation Events
Animation Events fire C# methods at precise frames during an animation clip — essential for synchronizing SFX, VFX, and gameplay logic with animation:
```csharp
using UnityEngine;

public class AnimationEventReceiver : MonoBehaviour
{
    [Header("Audio")]
    [SerializeField] private AudioSource audioSource;
    [SerializeField] private AudioClip[] footstepSounds;
    [SerializeField] private AudioClip swordSwingSound;
    [SerializeField] private AudioClip impactSound;

    [Header("VFX")]
    [SerializeField] private ParticleSystem dustParticle;
    [SerializeField] private ParticleSystem swordTrail;
    [SerializeField] private Transform weaponHitbox;

    // Called by an Animation Event on foot-down frames
    public void OnFootstep()
    {
        if (footstepSounds.Length == 0) return;

        AudioClip clip = footstepSounds[Random.Range(0, footstepSounds.Length)];
        audioSource.PlayOneShot(clip, 0.5f);
        dustParticle.Play();
    }

    // Called at the start of an attack swing
    public void OnAttackStart()
    {
        swordTrail.Play();
        audioSource.PlayOneShot(swordSwingSound, 0.7f);
    }

    // Called at the peak damage frame of an attack
    public void OnAttackHit()
    {
        // Enable the weapon's damage collider for a brief window
        weaponHitbox.gameObject.SetActive(true);
    }

    // Called at the end of the attack recovery
    public void OnAttackEnd()
    {
        swordTrail.Stop();
        weaponHitbox.gameObject.SetActive(false);
    }

    // Called when a landing animation plays
    public void OnLand(float intensity)
    {
        // The float parameter comes from the Animation Event's data
        audioSource.PlayOneShot(impactSound, intensity);
        CameraShake.Instance?.Shake(intensity * 0.3f, 0.2f);
    }
}
```
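Events are usually placed in the Animation window or a model's Import Settings, but they can also be added from code, which helps when clips are generated or processed in bulk. A sketch; the clip reference, method name, and 40% timing are illustrative, and events added this way at runtime do not persist with the asset:

```csharp
using UnityEngine;

public class EventInjector : MonoBehaviour
{
    [SerializeField] private AnimationClip attackClip;

    private void Awake()
    {
        // Fire OnAttackHit() on whatever object plays this clip, 40% of the way in
        var hitEvent = new AnimationEvent
        {
            functionName = "OnAttackHit",
            time = attackClip.length * 0.4f
        };
        attackClip.AddEvent(hitEvent);
    }
}
```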
6. History: Legacy Animation to Mecanim
| Era | System | Approach | Status |
| --- | --- | --- | --- |
| 2005-2012 | Legacy Animation | Code-driven animation playback with Animation.Play(), CrossFade(), blend weights. No visual state machine. | Deprecated. Still usable for simple animations (UI, props) but not recommended for characters. |
| 2012-Present | Mecanim (Animator) | Visual state machine, Animator Controller, blend trees, IK, Avatar system, humanoid retargeting, layers, masks. | Current standard. Continually improved with Timeline, Animation Rigging, and performance optimizations. |
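For context, the legacy system was driven entirely from code. A minimal sketch of the old API, which still works for simple props (the "DoorOpen" clip name is an assumption):

```csharp
using UnityEngine;

[RequireComponent(typeof(Animation))]
public class LegacyDoor : MonoBehaviour
{
    private Animation anim;

    private void Awake() => anim = GetComponent<Animation>();

    public void Open()
    {
        // No state machine: each blend is requested manually from code
        anim.CrossFade("DoorOpen", 0.2f);
    }
}
```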
Case Study: Cuphead — Hand-Drawn Animation in Unity
Studio MDHR's Cuphead features over 50,000 hand-drawn animation frames — all managed through Unity's animation system. The team drew every frame on paper, scanned them, cleaned them digitally, and imported them as sprite sheets. Key technical insights:
- Frame-by-frame sprite animation using Unity's 2D animation tools, not bone-based animation
- Animator Controllers managed hundreds of states per character (idle, run, duck, parry, shoot, super, death, intro, etc.)
- Animation Events synchronized sound effects precisely with visual hits, explosions, and character reactions
- Custom animation timing — the team matched 1930s cartoon frame rates (24fps with selective 12fps holds) for authentic vintage feel
Case Study: Ori and the Blind Forest — Blend Trees for Fluid Movement
Moon Studios' Ori and the Blind Forest is celebrated for its incredibly fluid character movement. The game demonstrates masterful use of blend trees and animation layering:
- Speed-based blend trees seamlessly transition between idle, walk, run, and sprint without visible animation pops
- Air control animations blend between rising, floating, and falling based on vertical velocity
- Anticipation and follow-through — Ori's movement includes subtle animation lead-in and overshoot, following Disney's 12 principles of animation
- Layered animations allow attacking while running, with upper body overrides that don't interrupt locomotion
Exercises & Self-Assessment
Exercise 1: Character Locomotion Blend Tree
Create a character with smooth locomotion using blend trees:
- Import a humanoid character model (use Unity's free Starter Assets or Mixamo)
- Create a 1D blend tree with Idle, Walk, and Run animations controlled by a "Speed" float parameter
- Write a script that sets the Speed parameter based on input magnitude
- Add a sprint modifier (hold Shift) that pushes Speed to maximum
- Verify smooth transitions by gradually increasing/decreasing speed
Exercise 2: Complete Character State Machine
Build a full character Animator Controller:
- Create states: Idle, Locomotion (blend tree), Jump (start/air/land), Attack (3-hit combo), Hurt, Death
- Configure transitions with appropriate Has Exit Time and Transition Duration settings
- Add an Any State transition to Hurt (with CanBeHurt bool guard)
- Implement the C# controller that sets all parameters based on gameplay
- Test every possible state transition to verify no broken paths
Exercise 3: Foot IK on Uneven Terrain
Implement Inverse Kinematics for realistic foot placement:
- Create an uneven terrain with slopes, steps, and ledges
- Implement the OnAnimatorIK callback for foot placement using raycasts
- Adjust the character's hip height based on the average foot positions
- Compare movement with and without IK — observe how feet penetrate or float without it
- Add Look At IK so the character's head tracks a target object
Exercise 4: Reflective Questions
- When would you choose root motion over script-driven movement? Describe a game genre where each approach is clearly superior.
- An attack animation should not be cancellable by walking, but should be cancellable by taking damage. How do you configure the Animator transitions to achieve this?
- Your 2D blend tree has 8 directional animations, but movement in the 45-degree diagonals looks wrong. What blend tree type should you use, and how would you configure it?
- A designer asks for a cutscene where the camera orbits a character while they perform an animation, then seamlessly returns to gameplay. Which Unity tools would you combine, and how?
- You have 200 NPCs on screen, each with a full Animator Controller. Performance is dropping. List three optimization strategies for animation at scale.
Conclusion & Next Steps
You now have a comprehensive understanding of Unity's animation systems. Here are the key takeaways from Part 6:
- Animation Clips store keyframed data — understand keyframes, interpolation curves, and looping for both imported and editor-created animations
- Animator Controllers are visual state machines that manage animation flow — states, transitions, parameters (bool/int/float/trigger), and layers
- Blend Trees create fluid movement by interpolating between clips — 1D for speed, 2D for directional movement, with proper damping for smooth transitions
- State Machines should mirror your game logic — validate transitions, handle interrupts properly, and use sub-state machines for organization
- Inverse Kinematics adds procedural realism — foot placement on terrain, hand grip on objects, head tracking for look-at behavior
- Root Motion vs Script-Driven is a fundamental design choice — root motion for realistic weight, script-driven for responsive arcade feel
- Timeline & Cinemachine enable cinematic sequences — combine them for cutscenes, camera animations, and scripted events
- Animation Events synchronize gameplay with animation — footstep sounds, attack hitbox activation, VFX triggers
Next in the Series
In Part 7: Audio & Visual Effects, we'll add the sensory layer that makes games truly immersive — AudioSource and AudioMixer systems, particle effects for fire/smoke/magic, the GPU-powered VFX Graph for massive simulations, and post-processing effects like bloom, color grading, and ambient occlusion.
Continue the Series
- Part 7: Audio & Visual Effects. AudioSource, particle systems, VFX Graph, post-processing, and creating immersive audiovisual experiences.
- Part 5: UI Systems. Canvas, uGUI, UI Toolkit, RectTransform, layout groups, responsive design, and dynamic UI generation.
- Part 8: Building & Publishing. Build pipeline, platform optimization, publishing workflows, and monetization strategies.