
Capstone 8: Autonomous Robot Platform

April 17, 2026 · Wasil Zafar · 45 min read

Build a mobile robotics platform with dual BLDC motor control, 2D LIDAR for SLAM navigation, IMU sensor fusion, and real-time path planning on an STM32MP1 heterogeneous SoC.

Table of Contents

  1. Platform Specifications
  2. System Architecture
  3. Motor Control Subsystem
  4. LIDAR & Navigation
  5. Robot Configuration Planner
  6. Conclusion

Platform Specifications

| Parameter | Specification | Component |
| --- | --- | --- |
| Application SoC | Cortex-A7 650 MHz + Cortex-M4 209 MHz | STM32MP157 |
| Motor Control | Dual BLDC, FOC, encoder feedback | DRV8313 × 2 |
| LIDAR | 360° 2D, 12 m range, 8000 samples/s | RPLIDAR A1 |
| IMU | 9-axis (accel + gyro + mag) | ICM-20948 |
| Wheel Encoders | Quadrature, 1024 PPR × 2 | Optical encoder |
| Battery | 11.1 V 5200 mAh LiPo (3S), ~60 min runtime | 3S LiPo pack |
| Chassis | Differential drive, 200 mm wheelbase | Aluminum frame |
| Connectivity | WiFi + USB OTG + CAN bus | Remote telemetry |

System Architecture

Autonomous Robot Hardware Architecture

flowchart TD
    A["3S LiPo
11.1V 5200mAh"] --> B["Power Distribution
5V + 3.3V + Motor"]
    B --> C["STM32MP157
A7: Navigation
M4: Motor Control"]
    C -->|UART| D["RPLIDAR A1
360° LIDAR"]
    C -->|SPI| E["ICM-20948
9-axis IMU"]
    C -->|PWM| F["DRV8313 × 2
BLDC Drivers"]
    F --> G["BLDC Motors
+ Encoders"]
    C -->|WiFi| H["Remote
Telemetry"]
    C -->|CAN| I["Expansion
Bus"]
    style C fill:#3B9797,color:#fff
    style D fill:#16476A,color:#fff
Heterogeneous SoC: The STM32MP157 runs Linux (Buildroot) on the Cortex-A7 for SLAM, path planning, and WiFi, while the Cortex-M4 handles real-time motor FOC at 10 kHz. Inter-processor communication uses OpenAMP shared memory with RPMSG mailbox for command/telemetry exchange at 1 kHz.
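
To make the A7↔M4 exchange concrete, here is a minimal Python sketch of a fixed-size command message that could ride on the RPMSG channel. The field layout, format strings, and function names are illustrative assumptions, not the OpenAMP wire format or this project's actual message definitions:

```python
import struct

# Hypothetical 1 kHz command message: sequence counter plus a twist
# (v_linear, w_angular). Little-endian, 12 bytes — small enough to fit
# comfortably in a single RPMSG payload.
CMD_FMT = "<Iff"  # uint32 seq, float32 v_linear, float32 w_angular

def pack_twist_cmd(seq: int, v_linear: float, w_angular: float) -> bytes:
    """Serialize a velocity command on the Linux (A7) side."""
    return struct.pack(CMD_FMT, seq, v_linear, w_angular)

def unpack_twist_cmd(payload: bytes):
    """Deserialize on the bare-metal (M4) side using the same layout."""
    seq, v, w = struct.unpack(CMD_FMT, payload)
    return seq, v, w
```

The key design point is that both cores agree on a fixed, packed layout, so the M4 can parse commands in constant time inside its 1 kHz loop.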

Motor Control Subsystem

/* BLDC Field-Oriented Control (FOC) — runs on Cortex-M4 at 10 kHz
   Differential drive: two independent BLDC motors with encoders */

#include <stdint.h>
#include <math.h>

#define FOC_FREQ_HZ       10000
#define ENCODER_PPR        1024
#define POLE_PAIRS         7
#define WHEEL_DIAMETER_MM  65
#define WHEELBASE_MM       200

typedef struct {
    float id;           /* d-axis current (field) */
    float iq;           /* q-axis current (torque) */
    float angle_elec;   /* Electrical angle (rad) */
    float speed_rpm;    /* Mechanical speed */
    float position_rad; /* Shaft position */
} motor_state_t;

typedef struct {
    float kp_d, ki_d;  /* d-axis PI gains */
    float kp_q, ki_q;  /* q-axis PI gains */
    float kp_speed, ki_speed;  /* Speed loop PI */
    float integral_d, integral_q, integral_speed;
    float max_current;  /* Current limit (A) */
} foc_controller_t;

/* Differential drive kinematics:
 * v_linear  = (v_right + v_left) / 2
 * w_angular = (v_right - v_left) / wheelbase
 *
 * Inverse (velocity command → wheel speeds):
 * v_left  = v_linear - (w_angular * wheelbase / 2)
 * v_right = v_linear + (w_angular * wheelbase / 2)
 */
typedef struct {
    float v_linear;     /* m/s (forward) */
    float w_angular;    /* rad/s (turn rate) */
} twist_cmd_t;

typedef struct {
    float x, y, theta; /* Robot pose in world frame */
} odometry_t;
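
The kinematics in the comment block above can be exercised directly. This Python sketch mirrors the C structs (function names and the dead-reckoning update are illustrative, not the firmware's exact code):

```python
import math

WHEELBASE_M = 0.200  # matches WHEELBASE_MM above

def twist_to_wheel_speeds(v_linear, w_angular, wheelbase=WHEELBASE_M):
    """Inverse kinematics: twist command -> (v_left, v_right) in m/s."""
    v_left = v_linear - (w_angular * wheelbase / 2.0)
    v_right = v_linear + (w_angular * wheelbase / 2.0)
    return v_left, v_right

def update_odometry(x, y, theta, v_left, v_right, dt, wheelbase=WHEELBASE_M):
    """Dead-reckoning pose update from measured wheel speeds."""
    v = (v_right + v_left) / 2.0            # forward kinematics
    w = (v_right - v_left) / wheelbase
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta
```

For example, a pure forward command (v = 0.5 m/s, w = 0) yields equal wheel speeds, while a pure rotation (v = 0, w = 1 rad/s) yields a 0.2 m/s speed difference across the 200 mm wheelbase.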

LIDAR & Navigation

# LIDAR scan processing — obstacle detection and path planning
# Simulates RPLIDAR A1 scan data for 2D occupancy grid

import numpy as np

# RPLIDAR A1 parameters
scan_freq_hz = 5.5       # Scans per second
samples_per_scan = 360   # ~1° angular resolution
max_range_m = 12.0

# Generate synthetic LIDAR scan (rectangular room with obstacle)
angles_deg = np.linspace(0, 359, samples_per_scan)
angles_rad = np.deg2rad(angles_deg)

# Room: 5m × 4m, robot at center (2.5, 2.0)
distances = np.full(samples_per_scan, max_range_m)
for i, angle in enumerate(angles_rad):
    dx = np.cos(angle)
    dy = np.sin(angle)
    # Distance to the nearest x-wall (x = ±2.5 m from the robot)
    t_x = min(abs(2.5 / dx), max_range_m) if abs(dx) > 1e-3 else max_range_m
    # Distance to the nearest y-wall (y = ±2.0 m from the robot)
    t_y = min(abs(2.0 / dy), max_range_m) if abs(dy) > 1e-3 else max_range_m
    distances[i] = min(t_x, t_y, max_range_m)

# Occupancy grid statistics
valid_points = np.sum(distances < max_range_m)
min_dist = np.min(distances)
mean_dist = np.mean(distances[distances < max_range_m])

print("LIDAR Scan Processing")
print("=" * 50)
print(f"Scan frequency:    {scan_freq_hz} Hz")
print(f"Points per scan:   {samples_per_scan}")
print(f"Valid points:      {valid_points}")
print(f"Min distance:      {min_dist:.2f} m")
print(f"Mean distance:     {mean_dist:.2f} m")
print(f"Max range:         {max_range_m} m")
print(f"\nOccupancy grid:    200 × 200 cells")
print(f"Resolution:        25 mm/cell (5m × 5m)")
print(f"Update rate:       {scan_freq_hz} Hz")
print(f"Path planner:      A* with 8-connected grid")
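
The A* planner mentioned in the output above can be sketched on an 8-connected occupancy grid. The function name, grid encoding (0 = free, 1 = occupied), and the octile-distance heuristic are illustrative choices, not the firmware's actual implementation:

```python
import heapq
import math

def astar_8conn(grid, start, goal):
    """A* over an occupancy grid with 8-connected moves.
    Straight moves cost 1, diagonals cost sqrt(2); the octile-distance
    heuristic is admissible for this cost model."""
    rows, cols = len(grid), len(grid[0])
    SQRT2 = math.sqrt(2)

    def h(n):
        dr, dc = abs(n[0] - goal[0]), abs(n[1] - goal[1])
        return max(dr, dc) + (SQRT2 - 1.0) * min(dr, dc)

    g = {start: 0.0}          # best known cost-to-come
    parent = {start: None}
    pq = [(h(start), start)]  # (f = g + h, cell)
    closed = set()
    while pq:
        _, node = heapq.heappop(pq)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        if node in closed:
            continue
        closed.add(node)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nb = (node[0] + dr, node[1] + dc)
                if not (0 <= nb[0] < rows and 0 <= nb[1] < cols):
                    continue
                if grid[nb[0]][nb[1]]:
                    continue  # occupied cell
                ng = g[node] + (SQRT2 if dr and dc else 1.0)
                if ng < g.get(nb, float("inf")):
                    g[nb] = ng
                    parent[nb] = node
                    heapq.heappush(pq, (ng + h(nb), nb))
    return None  # no path
```

On the robot this would run over the 200 × 200 cell grid above; at 25 mm/cell, even a worst-case search is cheap enough for the 5.5 Hz map update rate.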

Robot Configuration Planner

An interactive tool for planning your robot's hardware configuration, with export to Word, Excel, or PDF.

Conclusion

This capstone brings together motor control (BLDC FOC on Cortex-M4), navigation (LIDAR SLAM on Cortex-A7), sensor fusion (IMU + encoders), and system integration on a heterogeneous SoC. The platform demonstrates the full embedded systems stack from low-level PWM generation to high-level path planning algorithms.

Next Up: Professional Templates

In Template 1: STM32 Custom PCB Walkthrough, we’ll provide a complete step-by-step template for designing, manufacturing, and assembling a custom STM32-based PCB.