Series Context: This is Part 17 of the 20-part CMSIS Mastery Series. Parts 1–16 built your firmware — this part and the ones that follow focus on verifying it, optimising it, and maintaining it professionally. Testing is not optional in embedded systems.
1. Overview & ARM Cortex-M Ecosystem (CMSIS layers, Cortex-M families, memory map, toolchains)
2. CMSIS-Core: Registers, NVIC & SysTick (core_cmX.h, register access, interrupt controller, SysTick timer)
3. Startup Code, Linker Scripts & Vector Table (Reset handler, BSS init, scatter files, boot process)
4. CMSIS-RTOS2: Threads, Mutexes & Semaphores (Thread management, synchronization primitives, scheduling)
5. CMSIS-RTOS2: Message Queues & Event Flags (Inter-thread comms, ISR-to-thread, real-time design patterns)
6. CMSIS-DSP: Filters, FFT & Math Functions (FIR/IIR filters, FFT, SIMD optimizations)
7. CMSIS-Driver: UART, SPI & I2C (Driver abstraction layer, callbacks, DMA integration)
8. CMSIS-Pack & Software Components (Pack files, device support, dependency management)
9. Debugging with CMSIS-DAP & CoreSight (SWD/JTAG, HardFault analysis, ITM tracing)
10. Portable Firmware: Multi-Vendor Projects (HAL vs CMSIS, cross-platform BSPs, reusable libraries)
11. Interrupts, Concurrency & Real-Time Constraints (Interrupt latency, critical sections, lock-free programming)
12. Memory Management in Embedded Systems (Static vs dynamic, heap fragmentation, memory pools)
13. Low Power & Energy Optimization (Sleep modes, clock gating, tickless RTOS, power profiling)
14. DMA & High-Performance Data Handling (DMA basics, peripheral transfers, zero-copy techniques)
15. Security: ARMv8-M & TrustZone (Secure/non-secure worlds, secure boot, firmware protection)
16. Bootloaders & Firmware Updates (OTA updates, dual-bank flash, fail-safe strategies)
17. Testing & Validation (Unity/Ceedling unit tests, HIL testing, integration testing) ← You Are Here
18. Performance Optimization (Compiler flags, inline assembly, cache (M7/M33), profiling)
19. Embedded Software Architecture (Layered design, event-driven, state machines, component-based)
20. Tooling & Workflow (Professional Level) (CI/CD for embedded, MISRA, static analysis, Doxygen)
Testing Fundamentals for Embedded Systems
Testing embedded firmware is fundamentally harder than testing server-side software: hardware is expensive and not always available, hardware faults make behaviour non-deterministic, and bugs can destroy physical components. Yet untested firmware reaches production constantly, because embedded engineers have historically lacked the tooling culture of their software counterparts.
The modern approach treats embedded firmware like any other software: test early, test often, automate everything. The key insight is that most firmware logic can be tested without hardware — if you write it with testability in mind. This means separating hardware-dependent code from pure logic, and using mocks or stubs to replace hardware during host-based unit tests.
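As a minimal sketch of this separation (module and function names are illustrative, not from the series code): a debounce filter written as pure logic takes raw samples as arguments instead of reading a GPIO register, so it runs unmodified on a Linux host.

```c
/* debounce.h — pure logic, no hardware includes (illustrative sketch) */
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    uint8_t history;   /* Last 8 raw samples, newest in bit 0 */
    bool    state;     /* Current debounced state */
} debounce_t;

/* Feed one raw sample; returns the debounced state.
 * The caller (hardware layer on target, test code on host) supplies the
 * sample — this function never touches a GPIO register. */
static inline bool debounce_update(debounce_t *d, bool raw)
{
    d->history = (uint8_t)((d->history << 1) | (raw ? 1U : 0U));
    if ((d->history & 0x0FU) == 0x0FU) { d->state = true;  } /* 4 stable highs */
    if ((d->history & 0x0FU) == 0x00U) { d->state = false; } /* 4 stable lows  */
    return d->state;
}
```

On target, a thin hardware layer calls `debounce_update()` with the pin reading; on the host, a unit test feeds it any sample sequence it likes, including bounce patterns that are hard to reproduce physically.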
The Embedded Testing Pyramid: Unit tests (host-based, fast, cheap) form the base. Integration tests (on target or HIL) sit in the middle. Full system tests (HIL with real stimuli) are at the top. Invest most effort at the base — catching bugs on a Linux host is 100x cheaper than debugging on hardware.
The table below maps testing techniques to their practical characteristics — helping you decide where to invest effort for a given project risk profile.
| Technique | Cost | Feedback Speed | Coverage Scope | Tooling |
|---|---|---|---|---|
| Unit Tests (host) | Very Low | Seconds | Module logic only | Unity, CppUTest, Google Test |
| Unit Tests (on-target) | Low | Minutes | Logic + interrupt timing | Unity + semihosting, SEGGER RTT |
| Integration Tests (on-target) | Medium | Minutes–Hours | Multi-module interaction | Custom harness, Ceedling runners |
| HIL Testing | High | Hours | Full system with real stimuli | dSPACE, NI TestStand, custom Raspberry Pi rigs |
| Regression Testing | Low (after setup) | CI pipeline duration | Depends on test suite | GitHub Actions, Jenkins, GitLab CI |
| Stress / Soak Testing | Medium | Days–Weeks | Timing, memory, temperature drift | Environmental chambers, long-run HIL |
Unity & Ceedling Unit Tests
Unity is the de facto C unit testing framework for embedded systems. It is a single .c file and a single .h — minimal dependencies, no C++ required, no dynamic allocation. It provides the assertion macros you need (TEST_ASSERT_EQUAL, TEST_ASSERT_NOT_NULL, TEST_ASSERT_FLOAT_WITHIN), a test runner structure, and a results summary compatible with CI systems via JUnit XML output.
Unity Framework Basics: Ring Buffer Unit Test
The following example tests a ring buffer module — a common firmware component. Notice setUp and tearDown run before and after every test case, ensuring isolation.
/**
 * test_ring_buffer.c
 * Unity unit tests for the ring_buffer.c module.
 * Compile and run on host:
 *   gcc test_ring_buffer.c ring_buffer.c unity.c -o test_rb && ./test_rb
 */
#include "unity.h"
#include "ring_buffer.h"

/* Test fixture — one instance per test */
static ring_buffer_t g_rb;
static uint8_t g_storage[8];

/* Called before every TEST_CASE */
void setUp(void) {
    ring_buffer_init(&g_rb, g_storage, sizeof(g_storage));
}

/* Called after every TEST_CASE */
void tearDown(void) {
    /* Nothing to free — static allocation */
}

/* ───── Tests ───── */

void test_Init_EmptyBuffer(void) {
    TEST_ASSERT_EQUAL_UINT32(0, ring_buffer_count(&g_rb));
    TEST_ASSERT_EQUAL_UINT32(sizeof(g_storage), ring_buffer_capacity(&g_rb));
    TEST_ASSERT_TRUE(ring_buffer_is_empty(&g_rb));
    TEST_ASSERT_FALSE(ring_buffer_is_full(&g_rb));
}

void test_PushOne_CountIsOne(void) {
    ring_buffer_push(&g_rb, 0xABU);
    TEST_ASSERT_EQUAL_UINT32(1, ring_buffer_count(&g_rb));
    TEST_ASSERT_FALSE(ring_buffer_is_empty(&g_rb));
}

void test_PushPop_DataIntact(void) {
    uint8_t out = 0;
    ring_buffer_push(&g_rb, 0x42U);
    TEST_ASSERT_EQUAL_INT(RING_BUFFER_OK, ring_buffer_pop(&g_rb, &out));
    TEST_ASSERT_EQUAL_UINT8(0x42U, out);
    TEST_ASSERT_TRUE(ring_buffer_is_empty(&g_rb));
}

void test_FillToCapacity_IsFull(void) {
    for (uint32_t i = 0; i < sizeof(g_storage); i++) {
        TEST_ASSERT_EQUAL_INT(RING_BUFFER_OK, ring_buffer_push(&g_rb, (uint8_t)i));
    }
    TEST_ASSERT_TRUE(ring_buffer_is_full(&g_rb));
    TEST_ASSERT_EQUAL_INT(RING_BUFFER_FULL, ring_buffer_push(&g_rb, 0xFFU));
}

void test_PopFromEmpty_ReturnsError(void) {
    uint8_t out = 0;
    TEST_ASSERT_EQUAL_INT(RING_BUFFER_EMPTY, ring_buffer_pop(&g_rb, &out));
}

void test_WrapAround_DataOrder(void) {
    /* Fill, drain half, fill again — tests wrap-around */
    for (uint32_t i = 0; i < sizeof(g_storage); i++) {
        ring_buffer_push(&g_rb, (uint8_t)i);
    }
    for (uint32_t i = 0; i < 4; i++) {
        uint8_t dummy;
        ring_buffer_pop(&g_rb, &dummy);
    }
    ring_buffer_push(&g_rb, 0xAAU);
    ring_buffer_push(&g_rb, 0xBBU);

    uint8_t out;
    ring_buffer_pop(&g_rb, &out);
    TEST_ASSERT_EQUAL_UINT8(4U, out);   /* First remaining item */
}

/* Unity test runner — generated by Ceedling or written manually */
int main(void) {
    UNITY_BEGIN();
    RUN_TEST(test_Init_EmptyBuffer);
    RUN_TEST(test_PushOne_CountIsOne);
    RUN_TEST(test_PushPop_DataIntact);
    RUN_TEST(test_FillToCapacity_IsFull);
    RUN_TEST(test_PopFromEmpty_ReturnsError);
    RUN_TEST(test_WrapAround_DataOrder);
    return UNITY_END();
}
Ceedling project.yml Configuration
Ceedling is the build system layer on top of Unity and CMock. It handles test discovery, compilation, runner generation, and reporting automatically. The project.yml drives everything — including cross-compilation for on-target tests.
# project.yml — Ceedling configuration for STM32F4 embedded project
---
:project:
  :use_exceptions: FALSE
  :use_mocks: TRUE
  :build_root: build/test
  :release_build: FALSE
  :test_file_prefix: test_
  :which_ceedling: gem          # Use gem-installed Ceedling

:environment: []

:extension:
  :executable: .out

:paths:
  :test:
    - test/**
  :source:
    - src/**
  :include:
    - src/
    - CMSIS/Core/Include/
    - CMSIS/Device/ST/STM32F4xx/Include/
    - test/support/             # Stubs and fake headers

:defines:
  :common: &common_defines
    - STM32F407xx
    - UNIT_TESTING              # Guards hardware-only code
    - TEST                      # Unity convention
  :test:
    - *common_defines

:cmock:
  :mock_prefix: mock_
  :when_no_prototypes: :warn
  :enforce_strict_ordering: TRUE
  :plugins:
    - :ignore
    - :callback
    - :return_thru_ptr

:gcov:
  :reports:
    - HtmlDetailed
    - Text
  :gcovr:
    :html_medium_threshold: 75
    :html_high_threshold: 90

:tools:
  # Use host GCC for unit tests (not arm-none-eabi)
  :test_compiler:
    :executable: gcc
    :arguments:
      - -I"$": COLLECTION_PATHS_TEST_SUPPORT_SOURCE_INCLUDE_VENDOR
      - -D$: COLLECTION_DEFINES_TEST_AND_VENDOR
      - -c "${1}"
      - -o "${2}"
      - -Wall
      - -Wextra
      - --coverage              # GCOV instrumentation
  :test_linker:
    :executable: gcc
    :arguments:
      - "${1}"
      - -o "${2}"
      - --coverage

:plugins:
  :load_paths:
    - "#{Ceedling.load_path}"
  :enabled:
    - stdout_pretty_tests_report
    - module_generator
    - gcov

# Run tests:         ceedling test:all
# Run with coverage: ceedling gcov:all utils:gcov
Key Pattern: Note the UNIT_TESTING define in the Ceedling config. In your production code, wrap direct hardware register access in #ifndef UNIT_TESTING guards so it compiles out during host tests and is replaced by stubs from test/support/.
Mock Peripherals with CMock
CMock auto-generates mock implementations from header files. Given Driver_USART.h (CMSIS-Driver interface), CMock produces mock_Driver_USART.c/.h that lets you set expectations, inject return values, and verify call sequences — entirely in software, without real UART hardware.
The generated mock header provides: Driver_USART_Initialize_ExpectAndReturn(), Driver_USART_Send_ExpectWithArrayAndReturn(), Driver_USART_GetStatus_ExpectAndReturn(). You inject the mock by linking mock_Driver_USART.o instead of the real driver object.
/**
 * test_uart_logger.c
 * Tests the uart_logger module using a CMock-generated mock
 * of the CMSIS-Driver USART interface.
 *
 * CMock generates mock_Driver_USART.h from Driver_USART.h:
 *   ruby cmock.rb Driver_USART.h
 */
#include "unity.h"
#include "mock_Driver_USART.h"   /* CMock-generated */
#include "uart_logger.h"         /* Module under test */

/* CMSIS-Driver capability struct returned by the mock (by value) */
static ARM_USART_CAPABILITIES s_caps = {
    .asynchronous     = 1,
    .flow_control_rts = 0,
    .flow_control_cts = 0,
};

void setUp(void) {
    /* CMock resets all expectations automatically */
}

void tearDown(void) {
    /* Generated runner verifies all expected calls were made */
}

/* Test: logger_init calls USART Initialize with correct baud rate */
void test_LoggerInit_CallsUsartInitialize(void) {
    /* Set up expectations on the mock */
    Driver_USART_GetCapabilities_ExpectAndReturn(s_caps);
    Driver_USART_Initialize_ExpectAndReturn(uart_logger_event_cb, ARM_DRIVER_OK);
    Driver_USART_PowerControl_ExpectAndReturn(ARM_POWER_FULL, ARM_DRIVER_OK);
    Driver_USART_Control_ExpectAndReturn(
        ARM_USART_MODE_ASYNCHRONOUS | ARM_USART_DATA_BITS_8 |
        ARM_USART_STOP_BITS_1 | ARM_USART_PARITY_NONE |
        ARM_USART_FLOW_CONTROL_NONE,
        115200U,
        ARM_DRIVER_OK
    );

    /* Call the function under test */
    int32_t rc = uart_logger_init(115200U);
    TEST_ASSERT_EQUAL_INT32(UART_LOGGER_OK, rc);
}

/* Test: logger_write sends correct byte sequence */
void test_LoggerWrite_SendsBytes(void) {
    const char *msg = "HELLO";
    const uint32_t len = 5;

    Driver_USART_Send_ExpectWithArrayAndReturn(
        (const void *)msg, len, len,   /* data, depth (elements compared), num */
        ARM_DRIVER_OK
    );

    int32_t rc = uart_logger_write((const uint8_t *)msg, len);
    TEST_ASSERT_EQUAL_INT32(UART_LOGGER_OK, rc);
}

/* Test: driver error is propagated correctly */
void test_LoggerWrite_DriverError_ReturnsError(void) {
    Driver_USART_Send_IgnoreAndReturn(ARM_DRIVER_ERROR);

    int32_t rc = uart_logger_write((const uint8_t *)"X", 1);
    TEST_ASSERT_EQUAL_INT32(UART_LOGGER_ERR_DRIVER, rc);
}

int main(void) {
    UNITY_BEGIN();
    RUN_TEST(test_LoggerInit_CallsUsartInitialize);
    RUN_TEST(test_LoggerWrite_SendsBytes);
    RUN_TEST(test_LoggerWrite_DriverError_ReturnsError);
    return UNITY_END();
}
CMock
What CMock Generates
For every function in Driver_USART.h, CMock generates _Expect, _ExpectAndReturn, _ExpectWithArray, _IgnoreAndReturn, and _StubWithCallback variants. You set expectations; the generated test runner then verifies after each test that every expected call occurred, with the expected arguments, in the expected order.
Test Doubles
Mocks vs Stubs vs Fakes
Stubs return fixed values with no verification. Mocks (CMock) verify call order and arguments. Fakes are working implementations without side effects (e.g., an in-memory EEPROM fake). Use the simplest double that satisfies the test.
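An in-memory EEPROM fake can be sketched in a few lines (API names are illustrative; the point is that it honours the same contract as the real driver, with no mock bookkeeping):

```c
/* fake_eeprom.c — an in-memory "fake" test double (illustrative sketch) */
#include <stdint.h>
#include <string.h>

#define FAKE_EEPROM_SIZE 1024U

static uint8_t s_mem[FAKE_EEPROM_SIZE];

/* Same signatures as the real EEPROM driver the production code calls */
int eeprom_write(uint32_t addr, const uint8_t *data, uint32_t len)
{
    if ((addr + len) > FAKE_EEPROM_SIZE) { return -1; }  /* Out of range */
    memcpy(&s_mem[addr], data, len);
    return 0;
}

int eeprom_read(uint32_t addr, uint8_t *data, uint32_t len)
{
    if ((addr + len) > FAKE_EEPROM_SIZE) { return -1; }
    memcpy(data, &s_mem[addr], len);
    return 0;
}
```

A settings-storage module linked against this fake can exercise full write/read-back/corrupt-and-recover scenarios on the host, which a mock (call-by-call expectations) would make tedious to express.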
| Framework | Language | Mocking | Coverage | Licence |
|---|---|---|---|---|
| Unity + Ceedling | C (pure) | CMock (auto-generated) | GCOV via Ceedling gcov plugin | MIT |
| Google Test (gtest) | C++ (C headers supported) | gMock built-in | LCOV / gcovr | BSD-3-Clause |
| CppUTest | C / C++ | CppUMock built-in | GCOV compatible | BSD-3-Clause |
| CMSIS-RTOS2 RTX Test Suite | C | None (integration focus) | Target-side only | Apache 2.0 |
Hardware-in-the-Loop (HIL) Testing
HIL testing connects real firmware running on the target MCU to a test controller (typically a Raspberry Pi, a PC, or a dedicated HIL system like NI VeriStand) that stimulates inputs and verifies outputs. Unlike pure software unit tests, HIL tests exercise real peripherals, real timing, and real electrical behaviour.
A typical low-cost HIL rig for an embedded project uses: target MCU board, a CMSIS-DAP probe for flash/reset control, a Raspberry Pi as test controller connected via UART/SPI/I2C/GPIO, and a Python test runner (pytest + pyserial) orchestrating the test scenarios.
HIL vs Software Tests: HIL tests are valuable but expensive — they require physical hardware, are slower, and are harder to parallelise. Reserve HIL for testing things that cannot be mocked: actual peripheral timing, DMA transfers, interrupt latency measurements, power consumption, and communication protocol compliance.
The key to productive HIL testing is a firmware test mode — a special firmware build that accepts commands over a control UART and reports results. The test controller sends test vectors, the firmware processes them using real hardware, and reports pass/fail over the control channel. This architecture separates test orchestration (on the controller) from test execution (on the target).
Target Side
Firmware Test Mode
Build a special TEST_BUILD firmware that exposes a command UART. Commands trigger specific test scenarios (send SPI frame, measure ADC, toggle GPIO). Results reported in a structured format (JSON or fixed-length binary) for the test controller to parse.
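The heart of such a test mode is a small command dispatcher. A sketch (command set, framing, and all names here are project-specific assumptions, not a fixed protocol):

```c
/* test_mode.c — minimal command dispatcher for a TEST_BUILD firmware
 * (illustrative sketch; real projects add framing, CRC, timeouts) */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

typedef int (*test_cmd_fn)(const char *args, char *reply, size_t reply_len);

typedef struct {
    const char *name;      /* Command keyword received over the control UART */
    test_cmd_fn handler;
} test_cmd_t;

/* Example handler: report a firmware version (value is a placeholder) */
static int cmd_version(const char *args, char *reply, size_t reply_len)
{
    (void)args;
    snprintf(reply, reply_len, "{\"cmd\":\"VERSION\",\"result\":\"1.0.0\"}");
    return 0;
}

static const test_cmd_t s_cmds[] = {
    { "VERSION", cmd_version },
    /* { "ADC_READ", cmd_adc_read }, { "GPIO_SET", cmd_gpio_set }, ... */
};

/* Parse one line from the control UART and produce a structured reply */
int test_mode_dispatch(const char *line, char *reply, size_t reply_len)
{
    for (size_t i = 0; i < sizeof(s_cmds) / sizeof(s_cmds[0]); i++) {
        size_t n = strlen(s_cmds[i].name);
        if (strncmp(line, s_cmds[i].name, n) == 0) {
            return s_cmds[i].handler(line + n, reply, reply_len);
        }
    }
    snprintf(reply, reply_len, "{\"error\":\"unknown command\"}");
    return -1;
}
```

Handlers are the only place that touches real peripherals; the dispatcher itself is pure logic and can be unit tested on the host before it ever runs on the rig.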
Controller Side
Python pytest HIL Runner
pytest fixtures open the serial port, flash the firmware via pyocd or OpenOCD, send test commands, and assert on responses. Each test function is a pytest case — output is JUnit XML for CI integration. Parameterise over multiple boards for parallel HIL execution.
Code Coverage with GCOV
Code coverage measures which lines, branches, and functions your tests exercise. For embedded firmware, branch coverage (every if taken both ways) is more meaningful than line coverage alone — a single uncovered branch can hide a critical bug path.
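The gap between the two metrics fits in a few lines (function name illustrative):

```c
#include <stdint.h>

/* A single test call clamp_to_limit(20, 10) executes every line of this
 * function, so line coverage reports 100%. But the decision was only ever
 * true — the v <= limit path is untested, and branch coverage reports the
 * untaken 'false' edge of the if. */
uint32_t clamp_to_limit(uint32_t v, uint32_t limit)
{
    if (v > limit) {
        v = limit;
    }
    return v;
}
```

Adding a second test with v below the limit closes the gap, which is exactly the kind of missing case branch coverage exists to surface.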
GCC supports GCOV natively via --coverage. When tests run on the host with this flag, .gcda files are produced alongside .gcno files. The gcovr tool converts these into HTML reports you can browse or enforce as a CI gate.
# Step 1: Compile test objects with GCOV instrumentation
# (Ceedling handles this automatically when the gcov plugin is enabled)
# Note: -O0 disables optimisation, which prevents coverage distortion
gcc -c ring_buffer.c -o ring_buffer.o \
    --coverage -fprofile-arcs -ftest-coverage \
    -O0 -DUNIT_TESTING
gcc -c test_ring_buffer.c -o test_ring_buffer.o --coverage -DUNIT_TESTING
gcc -c unity.c -o unity.o

# Step 2: Link
gcc ring_buffer.o test_ring_buffer.o unity.o -o test_rb --coverage

# Step 3: Run tests (produces .gcda files)
./test_rb

# Step 4: Generate HTML coverage report with gcovr
# The --fail-under-* options make gcovr exit non-zero for CI gating:
# fail if line coverage < 80% or branch coverage < 70%
gcovr \
    --root . \
    --exclude 'unity\.c' \
    --exclude 'test_.*\.c' \
    --html --html-details \
    --output coverage/index.html \
    --print-summary \
    --fail-under-line 80 \
    --fail-under-branch 70

# Step 5: Open report
xdg-open coverage/index.html

# Expected output:
# ------------------------------------------------------------------------------
# TOTAL                          345     278   80.6%      68      49   72.1%
# ------------------------------------------------------------------------------
# lines: 80.6% (278 out of 345)
# branches: 72.1% (49 out of 68)

# Alternative: generate Cobertura XML for CI upload
gcovr --xml --output coverage.xml
Coverage Targets: For safety-critical firmware (IEC 61508 SIL 2+, ISO 26262 ASIL B+), 100% modified condition/decision coverage (MC/DC) may be required. For commercial embedded products, 80% line + 70% branch coverage is a practical starting target that finds real gaps without being impractical.
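To make MC/DC concrete with a minimal example (function and condition names are illustrative): for a decision with two AND-ed conditions, each condition must be shown to independently flip the outcome, which needs three test vectors where plain branch coverage needs only two.

```c
#include <stdbool.h>

/* Interlock: the motor may run only when the guard is closed AND no fault
 * is latched. (Illustrative safety-style decision.) */
bool motor_allowed(bool guard_closed, bool no_fault)
{
    return guard_closed && no_fault;
}

/* Branch coverage: decision true once, false once       -> 2 vectors.
 * MC/DC: each condition independently flips the result  -> 3 vectors:
 *   (true,  true)  -> true    baseline
 *   (false, true)  -> false   guard_closed alone flips the outcome
 *   (true,  false) -> false   no_fault alone flips the outcome        */
```

Note that the vector (false, false) adds nothing for MC/DC here; the requirement is about demonstrating each condition's independent effect, not enumerating the truth table.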
CI/CD for Embedded Projects
Continuous integration for embedded firmware means: every push triggers a build, unit tests run, static analysis executes, coverage is measured, and the results gate the merge. Part 20 covers this in full; here we focus on the GitHub Actions workflow that ties together everything covered in this article.
# .github/workflows/embedded-ci.yml
name: Embedded Firmware CI

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          submodules: recursive

      - name: Install arm-none-eabi toolchain
        run: |
          sudo apt-get update -qq
          sudo apt-get install -y gcc-arm-none-eabi binutils-arm-none-eabi \
            cmake ninja-build ruby gcovr

      - name: Install Ceedling
        run: gem install ceedling

      - name: Build firmware (arm-none-eabi)
        run: |
          mkdir -p build && cd build
          cmake -G Ninja \
            -DCMAKE_TOOLCHAIN_FILE=../arm-none-eabi.cmake \
            -DCMAKE_BUILD_TYPE=Release \
            ..
          ninja

      - name: Run Unity unit tests (host GCC)
        run: ceedling test:all
        working-directory: ${{ github.workspace }}

      - name: Generate coverage report
        run: ceedling gcov:all utils:gcov
        working-directory: ${{ github.workspace }}

      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          files: build/test/artifacts/gcov/GcovCoverageResults.xml
          flags: embedded-unit-tests
          fail_ci_if_error: true

      - name: Run SonarQube static analysis
        uses: SonarSource/sonarcloud-github-action@master
        env:
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        with:
          args: >
            -Dsonar.projectKey=my-firmware
            -Dsonar.sources=src
            -Dsonar.cfamily.build-wrapper-output=build/bw-output

      - name: Upload firmware artifact
        uses: actions/upload-artifact@v4
        with:
          name: firmware-${{ github.sha }}
          path: build/firmware.elf
          retention-days: 30
Host vs Cross Build: The workflow runs two compilers. The arm-none-eabi compiler builds the production firmware binary (verifying it compiles for the real target). The host GCC compiles and runs the unit tests with coverage. This is the correct separation — unit tests must run fast on the CI host, not on hardware.
Exercises
Exercise 1
Intermediate
Write Unity Tests for CRC32 Targeting 100% Branch Coverage
Implement a software CRC32 function that handles: normal data, empty input (zero length), single-byte input, all-zeros input, and all-0xFF input. Write Unity test cases targeting 100% branch coverage. Use gcovr to measure coverage and confirm you hit the target. Pay particular attention to boundary conditions in the loop structure.
Unity
Branch Coverage
gcovr
CRC32
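A possible starting point for the implementation (this is the common reflected CRC-32/IEEE variant; the function name is illustrative):

```c
#include <stdint.h>
#include <stddef.h>

/* Bitwise CRC-32 (IEEE 802.3): reflected, init 0xFFFFFFFF, final XOR.
 * Branch-coverage targets to watch: len == 0 must skip the outer loop
 * entirely, and the inner if must be observed both taken and not taken. */
uint32_t crc32_compute(const uint8_t *data, size_t len)
{
    uint32_t crc = 0xFFFFFFFFU;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++) {
            if (crc & 1U) {
                crc = (crc >> 1) ^ 0xEDB88320U;   /* Reflected polynomial */
            } else {
                crc >>= 1;
            }
        }
    }
    return crc ^ 0xFFFFFFFFU;
}
```

The standard check value is useful as a first test: CRC-32 of the ASCII string "123456789" is 0xCBF43926, and the CRC of zero-length input is 0x00000000.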
Exercise 2
Intermediate
CMock SPI Flash Driver — Mock CMSIS-Driver SPI
Generate a CMock mock for the CMSIS-Driver SPI interface (Driver_SPI.h). Write a minimal SPI NOR flash driver (spi_flash.c) that calls the driver's Send/Receive/Transfer functions through the CMSIS-Driver SPI interface. Create Unity tests using the mock to verify: (a) READ_ID sends the correct 0x9F opcode, (b) PAGE_PROGRAM performs the correct chip-select assert/deassert sequence, (c) driver error codes are propagated correctly upward through the flash driver API.
CMock
CMSIS-Driver SPI
Test Doubles
SPI Flash
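One way to structure the flash driver so it is mockable (all names here are illustrative assumptions): take the transfer function as an injected dependency, so either the real CMSIS-Driver SPI or a test double can stand behind it. The sketch below uses a hand-written spy in place of the CMock-generated mock the exercise asks for, just to show the shape of the test.

```c
#include <stdint.h>
#include <string.h>

/* Full-duplex transfer function type mirroring an ARM_SPI-style call:
 * clocks out[] while receiving into in[], num bytes. (Illustrative.) */
typedef int32_t (*spi_transfer_fn)(const uint8_t *out, uint8_t *in, uint32_t num);

#define SPI_FLASH_CMD_READ_ID 0x9FU

/* Read the 3-byte JEDEC ID. Returns 0 on success, -1 on driver error. */
int spi_flash_read_id(spi_transfer_fn xfer, uint8_t id[3])
{
    uint8_t tx[4] = { SPI_FLASH_CMD_READ_ID, 0U, 0U, 0U };
    uint8_t rx[4] = { 0U };
    if (xfer(tx, rx, 4U) != 0) { return -1; }
    memcpy(id, &rx[1], 3U);      /* ID bytes follow the opcode byte */
    return 0;
}

/* A minimal spy double for host tests: records the opcode that was sent
 * and returns a canned ID (0x1F 0x2F 0x3F — arbitrary test data). */
static uint8_t s_spy_opcode;
static int32_t spy_transfer(const uint8_t *out, uint8_t *in, uint32_t num)
{
    s_spy_opcode = out[0];
    if (num >= 4U) { in[1] = 0x1FU; in[2] = 0x2FU; in[3] = 0x3FU; }
    return 0;
}
```

With CMock, the spy disappears: the generated `_ExpectWithArrayAndReturn` variants record and verify the transmitted bytes for you, and `_ReturnThruPtr`-style plugins inject the canned ID.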
Exercise 3
Advanced
GitHub Actions CI Pipeline — Build + Test on Every Push
Set up a complete GitHub Actions CI pipeline for an embedded project. The pipeline must: (1) build firmware with arm-none-eabi-gcc and fail if it does not compile, (2) run Ceedling unit tests on host GCC and fail if any test fails, (3) measure branch coverage with gcovr and fail if branch coverage drops below 70%, (4) run cppcheck static analysis and fail on ERROR-severity findings. Bonus: upload the firmware ELF as a build artefact and post a coverage badge to the repository README.
GitHub Actions
Ceedling
gcovr
cppcheck
CI Gates
Conclusion & Next Steps
In this part we have built a complete embedded testing methodology:
- The embedded testing pyramid — invest most effort in fast, cheap, host-side unit tests; use HIL only for what truly requires hardware.
- Unity provides the assertion macros and test runner structure; Ceedling automates compilation, runner generation, and coverage reporting.
- CMock auto-generates mock implementations of any C header — including CMSIS-Driver interfaces — enabling pure software tests of hardware-dependent modules.
- GCOV + gcovr measures line and branch coverage; set thresholds as CI gates to enforce testing discipline across the team.
- A GitHub Actions CI pipeline that builds firmware, runs unit tests, measures coverage, and runs static analysis catches regressions automatically on every commit.
Next in the Series
In Part 18: Performance Optimization, we shift from correctness to speed — GCC and ARMClang optimisation flags, link-time optimisation, inline assembly with CMSIS intrinsics, instruction cache and TCM usage on M7/M33, DWT cycle counter profiling, and SIMD vectorisation of DSP loops.
Related Articles in This Series
Part 18: Performance Optimization
Compiler flags, LTO, inline assembly, cache and TCM usage on M7/M33, DWT cycle counter profiling, and SIMD intrinsics for maximum throughput.
Part 20: Tooling & Workflow — Professional Embedded Development
Complete CI/CD pipeline, MISRA-C static analysis, Doxygen documentation, semantic versioning, and release automation for professional embedded projects.
Part 9: Debugging with CMSIS-DAP & CoreSight
SWD/JTAG debugging, HardFault analysis, ITM trace output, and live variable watching — the debugging skills that complement systematic testing.