
Part 13: Real-World Projects

July 14, 2025 · Wasil Zafar · 55 min read

Eight hands-on projects from beginner to expert — weather station, line-following robot, smart greenhouse, home security system, robotic arm, industrial monitoring, drone flight controller, and autonomous navigation.

Table of Contents

  1. Beginner Projects
  2. Intermediate Projects
  3. Advanced Projects
  4. Expert Projects
  5. Conclusion & Next Steps

Beginner Projects

Project 1: IoT Weather Station

Weather Station Bill of Materials

Component | Purpose | Interface
ESP32 | MCU + WiFi | —
BME280 | Temp, humidity, pressure | I2C (0x76)
BH1750 | Light intensity | I2C (0x23)
Rain sensor | Rain detection | Analog
Wind speed (anemometer) | Wind measurement | Pulse count
SSD1306 OLED | Local display | I2C (0x3C)

# ESP32 weather station with MQTT reporting (MicroPython)
from machine import Pin, I2C, ADC
import time
import json
from umqtt.simple import MQTTClient
from bme280 import BME280
import network

# Join WiFi before connecting to the MQTT broker
wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("YOUR_SSID", "YOUR_PASSWORD")
while not wlan.isconnected():
    time.sleep(0.5)

# Initialize I2C bus (default ESP32 pins)
i2c = I2C(0, scl=Pin(22), sda=Pin(21), freq=400000)

# Initialize sensors
bme = BME280(i2c=i2c)
rain_adc = ADC(Pin(34))
rain_adc.atten(ADC.ATTN_11DB)

# MQTT configuration
BROKER = "192.168.1.100"
CLIENT_ID = "weather-station-01"
TOPIC = "weather/outdoor/readings"

client = MQTTClient(CLIENT_ID, BROKER)
client.connect()

while True:
    temp, pressure, humidity = bme.values
    rain_raw = rain_adc.read()
    is_raining = rain_raw < 2000  # Wet = lower resistance

    payload = json.dumps({
        "temperature_c": float(temp.replace('C', '')),
        "pressure_hpa": float(pressure.replace('hPa', '')),
        "humidity_pct": float(humidity.replace('%', '')),
        "rain_detected": is_raining,
        "rain_raw": rain_raw
    })

    client.publish(TOPIC, payload)
    print(f"Published: {payload}")
    time.sleep(60)
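The BOM lists the anemometer as a pulse-count input, which the snippet above omits. A minimal sketch of the count-to-speed conversion — on the ESP32 the pulses would come from a pin interrupt incrementing a counter; the calibration factor here is a placeholder based on the common hobbyist spec "1 Hz = 2.4 km/h" and must be replaced with your anemometer's datasheet value:

```python
# Convert anemometer pulse counts to wind speed.
# The default calibration (0.667 m/s per Hz, i.e. 2.4 km/h per Hz) is
# an illustrative placeholder; check your sensor's datasheet.

def wind_speed_ms(pulse_count: int, window_s: float,
                  pulses_per_rev: int = 1,
                  ms_per_hz: float = 0.667) -> float:
    """Wind speed in m/s from pulses counted over window_s seconds."""
    if window_s <= 0:
        raise ValueError("window_s must be positive")
    freq_hz = pulse_count / pulses_per_rev / window_s
    return freq_hz * ms_per_hz

# Example: 30 pulses in a 10 s window -> 3 Hz -> ~2 m/s
print(round(wind_speed_ms(30, 10.0), 2))
```

For reliable counts, debounce the reed-switch pulses (either in hardware with an RC filter or by ignoring edges closer together than a few milliseconds).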

Project 2: Line-Following Robot

Hardware:
  • MCU: STM32F103 (Blue Pill) or Arduino Uno
  • Sensors: 5-channel IR reflectance array (TCRT5000 × 5)
  • Motors: 2 × N20 gear motors + L298N H-bridge driver
  • Power: 7.4V LiPo battery + 5V regulator
  • Chassis: 2-wheel differential drive with caster wheel
// Line follower with PID control (STM32)
// Assumes board-support helpers implemented elsewhere in the project:
// adc_init(), adc_read_channel(), motor_init(), motor_set().
#include "stm32f1xx.h"
#include <stdint.h>

#define BASE_SPEED 150  // PWM duty (0-255)
#define KP 25.0f
#define KI 0.5f
#define KD 15.0f

typedef struct {
    uint16_t sensors[5];  // IR array readings
    int16_t  position;    // Weighted position (-2000 to +2000)
} SensorArray;

SensorArray read_line_sensors(void) {
    SensorArray arr;
    int32_t weighted_sum = 0;
    int32_t total = 0;
    int16_t weights[] = {-2000, -1000, 0, 1000, 2000};

    for (int i = 0; i < 5; i++) {
        arr.sensors[i] = adc_read_channel(i);
        weighted_sum += (int32_t)arr.sensors[i] * weights[i];
        total += arr.sensors[i];
    }

    arr.position = (total > 500) ? (weighted_sum / total) : 0;
    return arr;
}

void drive_motors(int16_t left_speed, int16_t right_speed) {
    // Clamp to valid PWM range
    left_speed  = (left_speed  > 255) ? 255 : (left_speed  < -255) ? -255 : left_speed;
    right_speed = (right_speed > 255) ? 255 : (right_speed < -255) ? -255 : right_speed;

    motor_set(MOTOR_LEFT,  left_speed);
    motor_set(MOTOR_RIGHT, right_speed);
}

int main(void) {
    SystemInit();
    motor_init();
    adc_init();

    float integral = 0, prev_error = 0;

    while (1) {
        SensorArray arr = read_line_sensors();
        float error = (float)arr.position;

        integral += error;
        // Basic anti-windup: clamp the accumulated integral term
        if (integral >  4000.0f) integral =  4000.0f;
        if (integral < -4000.0f) integral = -4000.0f;
        float derivative = error - prev_error;
        float correction = KP * error + KI * integral + KD * derivative;
        prev_error = error;

        drive_motors(BASE_SPEED + (int16_t)correction,
                     BASE_SPEED - (int16_t)correction);
    }
}

Intermediate Projects

Project 3: Smart Greenhouse Controller

Greenhouse Sensor & Actuator Network

Component | Function | Control Logic
BME280 | Air temp/humidity | Trigger fan if temp > 30°C
Soil moisture (capacitive) | Soil water level | Trigger pump if moisture < 40%
BH1750 | Light intensity | Activate grow lights if < 5000 lux
MH-Z19B | CO2 concentration | Open vent if CO2 > 1500 ppm
Water pump (12V) | Irrigation | MOSFET-switched, duty-cycled
Exhaust fan (12V) | Ventilation | PWM speed based on temperature delta
Servo | Vent window | Angle proportional to CO2 level
LED grow light | Supplemental light | PWM dimming + timer schedule
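The table's control logic can be sketched as a simple polling loop with hysteresis bands, so actuators don't chatter on and off around their thresholds. The thresholds come from the table; the band widths are illustrative assumptions:

```python
# Greenhouse control decisions from the table above, with hysteresis
# bands (band widths are illustrative assumptions) to prevent actuator
# chatter around each threshold.

def update_actuators(state: dict, temp_c: float, soil_pct: float,
                     lux: float, co2_ppm: float) -> dict:
    """Return new on/off states given sensor readings and previous state."""
    new = dict(state)
    # Fan: on above 30 °C, off again only below 28 °C
    if temp_c > 30.0:
        new["fan"] = True
    elif temp_c < 28.0:
        new["fan"] = False
    # Pump: on below 40 % soil moisture, off above 45 %
    if soil_pct < 40.0:
        new["pump"] = True
    elif soil_pct > 45.0:
        new["pump"] = False
    # Grow lights: on below 5000 lux, off above 6000 lux
    if lux < 5000.0:
        new["lights"] = True
    elif lux > 6000.0:
        new["lights"] = False
    # Vent: open above 1500 ppm CO2, close below 1200 ppm
    if co2_ppm > 1500.0:
        new["vent"] = True
    elif co2_ppm < 1200.0:
        new["vent"] = False
    return new

state = {"fan": False, "pump": False, "lights": False, "vent": False}
state = update_actuators(state, temp_c=31.0, soil_pct=38.0,
                         lux=4000.0, co2_ppm=1600.0)
print(state)  # all four actuators engage
```

Readings that fall inside a hysteresis band leave the corresponding actuator in its previous state, which is exactly the behavior that stops a pump from cycling rapidly as soil moisture hovers near 40%.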

Project 4: Home Security System

Security System Architecture:
  • PIR motion sensors: HC-SR501 at each entry point (interrupt-driven wake from sleep)
  • Magnetic door/window contacts: Reed switches on all openings
  • Ultrasonic ranging: HC-SR04 for volumetric detection in hallways
  • Piezo buzzer: 105 dB alarm siren (PWM-driven at resonant frequency)
  • ESP32 hub: Aggregates sensor data, sends push notifications via WiFi
  • RFID (RC522): Arm/disarm with authorized cards
  • Battery backup: 18650 Li-ion with charge controller for power failure resilience
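The hub's arm/disarm logic is naturally a small state machine. A sketch of one possible design — the state names, event names, and the 30 s entry delay are assumptions, not the only workable choices:

```python
# Minimal alarm state machine for the ESP32 hub logic described above.
# States, events, and the entry-delay duration are illustrative choices.

DISARMED, ARMED, ENTRY_DELAY, ALARM = "disarmed", "armed", "entry_delay", "alarm"
ENTRY_DELAY_S = 30  # time allowed to present an RFID card after a door opens

def next_state(state: str, event: str, elapsed_s: float = 0.0) -> str:
    """Advance the alarm state machine on an event.

    Events: 'arm', 'disarm' (authorized RFID), 'door_open',
    'motion', 'tick' (periodic timer carrying elapsed_s).
    """
    if event == "disarm":
        return DISARMED                 # authorized card always disarms
    if state == DISARMED:
        return ARMED if event == "arm" else DISARMED
    if state == ARMED:
        if event == "door_open":
            return ENTRY_DELAY          # give the owner time to disarm
        if event == "motion":
            return ALARM                # interior motion while armed
        return ARMED
    if state == ENTRY_DELAY:
        if event == "tick" and elapsed_s >= ENTRY_DELAY_S:
            return ALARM                # delay expired without a disarm
        return ENTRY_DELAY
    return state                        # ALARM latches until disarmed

s = ARMED
s = next_state(s, "door_open")           # door opens while armed
s = next_state(s, "tick", elapsed_s=31)  # nobody disarmed in time
print(s)  # prints "alarm"
```

Keeping the transitions in one pure function makes the logic easy to unit-test on a desktop before flashing it to the hub, where the events would be generated by PIR/reed-switch interrupts and the RFID reader.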

Advanced Projects

Project 5: 6-DOF Robotic Arm

# Inverse kinematics for a planar 2-link arm (the first two links of the 6-DOF arm)
import numpy as np
import matplotlib.pyplot as plt

# Arm link lengths (mm)
L1 = 150  # Shoulder to elbow
L2 = 120  # Elbow to wrist
L3 = 80   # Wrist to end-effector (not used by the 2-link solution below)

def inverse_kinematics_2link(x, y, l1, l2):
    """Calculate joint angles for 2-link arm to reach (x, y)"""
    dist = np.sqrt(x**2 + y**2)
    if dist > l1 + l2 or dist < abs(l1 - l2):
        raise ValueError("Target out of reach")

    # Elbow angle (law of cosines)
    cos_q2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    cos_q2 = np.clip(cos_q2, -1, 1)
    q2 = np.arccos(cos_q2)  # Elbow-down solution; use -q2 for elbow-up

    # Shoulder angle
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))

    return np.degrees(q1), np.degrees(q2)

# Calculate angles for target position
target_x, target_y = 200, 150
theta1, theta2 = inverse_kinematics_2link(target_x, target_y, L1, L2)
print(f"Shoulder: {theta1:.1f}°, Elbow: {theta2:.1f}°")

# Visualize arm configuration
j1_x = L1 * np.cos(np.radians(theta1))
j1_y = L1 * np.sin(np.radians(theta1))
j2_x = j1_x + L2 * np.cos(np.radians(theta1 + theta2))
j2_y = j1_y + L2 * np.sin(np.radians(theta1 + theta2))

plt.figure(figsize=(8, 8))
plt.plot([0, j1_x, j2_x], [0, j1_y, j2_y], 'b-o', linewidth=3)
plt.plot(target_x, target_y, 'rx', markersize=15, markeredgewidth=3)
plt.axis('equal')
plt.grid(True)
plt.title(f'Robotic Arm — Target ({target_x}, {target_y})')
plt.xlabel('X (mm)')
plt.ylabel('Y (mm)')
plt.show()

Project 6: Industrial Vibration Monitor

Predictive Maintenance System:
  • ADXL345 accelerometer: 3-axis vibration at up to 3200 Hz output data rate
  • STM32F4 + DMA: continuous SPI capture into a ring buffer, FFT analysis in firmware
  • Baseline learning: Record "healthy" vibration spectrum during commissioning
  • Anomaly detection: Compare live FFT peaks against baseline. Alert on bearing faults (characteristic frequencies), imbalance, or misalignment
  • LoRaWAN transmission: Send spectral features (not raw data) every 15 minutes
  • Dashboard: Node-RED + InfluxDB + Grafana for trend visualization
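The baseline-vs-live comparison step can be sketched with NumPy's FFT. The 3200 Hz rate follows the text; the 3× baseline alert ratio and the synthetic test signals are illustrative assumptions:

```python
# Sketch of the baseline-vs-live spectral comparison described above.
# The 3x-baseline alert rule and the synthetic signals are illustrative.
import numpy as np

FS = 3200  # sampling rate (Hz)
N = 4096   # samples per record

def spectrum(samples: np.ndarray) -> np.ndarray:
    """One-sided magnitude spectrum of a vibration record."""
    windowed = samples * np.hanning(len(samples))
    return np.abs(np.fft.rfft(windowed)) / len(samples)

def anomalous_bins(live: np.ndarray, baseline: np.ndarray,
                   ratio: float = 3.0, floor: float = 1e-6) -> np.ndarray:
    """Frequency bins where live energy exceeds ratio x baseline."""
    return np.where(live > ratio * np.maximum(baseline, floor))[0]

t = np.arange(N) / FS
healthy = np.sin(2 * np.pi * 50 * t)                  # 50 Hz running speed
faulty = healthy + 0.5 * np.sin(2 * np.pi * 157 * t)  # added bearing-fault tone

base = spectrum(healthy)                # learned during commissioning
bins = anomalous_bins(spectrum(faulty), base)
freqs = bins * FS / N
print(freqs)  # flagged frequencies include the 157 Hz fault tone
```

In the real system the baseline would be averaged over many records during commissioning, and only the flagged peak frequencies and magnitudes — not raw samples — would go out over LoRaWAN.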

Expert Projects

Project 7: Quadcopter Flight Controller

Flight Controller Sensor Suite

Sensor | Purpose | Update Rate
MPU6050 (IMU) | 6-axis gyro + accelerometer | 1 kHz
HMC5883L (Magnetometer) | Compass heading | 75 Hz
BMP280 (Barometer) | Altitude estimation | 100 Hz
GPS (NEO-6M) | Position, velocity | 5 Hz
VL53L0X (ToF) | Ground distance for landing | 50 Hz
Current sensor (ACS712) | Battery monitoring | 100 Hz

// Simplified quadcopter motor mixing (after PID loop)
#include <stdint.h>

typedef struct {
    float throttle;  // 0.0 to 1.0 (base thrust)
    float roll;      // -1.0 to 1.0 (roll correction)
    float pitch;     // -1.0 to 1.0 (pitch correction)
    float yaw;       // -1.0 to 1.0 (yaw correction)
} FlightCommand;

typedef struct {
    uint16_t motor[4];  // PWM values (1000-2000 µs)
} MotorOutput;

// Motor layout (X configuration, viewed from above):
//   M1(CW)  M2(CCW)
//       \  /
//        \/
//        /\
//       /  \
//   M4(CCW) M3(CW)

MotorOutput motor_mix(FlightCommand cmd) {
    MotorOutput out;
    float m[4];

    // Mixer: throttle ± roll ± pitch ± yaw
    m[0] = cmd.throttle - cmd.roll + cmd.pitch + cmd.yaw;  // Front-Left (CW)
    m[1] = cmd.throttle + cmd.roll + cmd.pitch - cmd.yaw;  // Front-Right (CCW)
    m[2] = cmd.throttle + cmd.roll - cmd.pitch + cmd.yaw;  // Rear-Right (CW)
    m[3] = cmd.throttle - cmd.roll - cmd.pitch - cmd.yaw;  // Rear-Left (CCW)

    // Convert to PWM (1000-2000 µs) and clamp
    for (int i = 0; i < 4; i++) {
        if (m[i] < 0.0f) m[i] = 0.0f;
        if (m[i] > 1.0f) m[i] = 1.0f;
        out.motor[i] = 1000 + (uint16_t)(m[i] * 1000.0f);
    }

    return out;
}
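The roll/pitch corrections fed into the mixer come from PID loops acting on an attitude estimate, which is in turn fused from the IMU's gyro and accelerometer. One lightweight fusion approach is a complementary filter; the alpha value and loop rate below are illustrative, and the sketch is in Python for clarity rather than flight firmware:

```python
# Complementary filter for roll-angle estimation: integrate the gyro
# for short-term accuracy, pull toward the accelerometer's tilt angle
# to cancel gyro drift. Alpha and the 1 kHz rate are illustrative.
import math

ALPHA = 0.98   # gyro weight; (1 - ALPHA) is the accel correction weight
DT = 0.001     # 1 kHz update period (s)

def comp_filter(angle_deg, gyro_dps, accel_y_g, accel_z_g):
    """One filter step: gyro integration plus accel drift correction."""
    accel_angle = math.degrees(math.atan2(accel_y_g, accel_z_g))
    return ALPHA * (angle_deg + gyro_dps * DT) + (1 - ALPHA) * accel_angle

# Stationary, level craft: gyro reads 0 dps, accel reads 1 g on Z.
angle = 10.0  # deliberately wrong initial estimate
for _ in range(5000):  # 5 s of updates
    angle = comp_filter(angle, 0.0, 0.0, 1.0)
print(round(angle, 3))  # prints 0.0 — estimate converges to level
```

The same structure applies to pitch with the accelerometer's X axis; a full flight controller would typically use quaternions or a Kalman/Mahony filter, but the complementary filter is a common starting point on an MPU6050.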

Project 8: Autonomous Navigation Robot

Autonomous Navigation Stack:
  • Perception: RPLiDAR A1 (360° 2D scan), Intel RealSense D435 (depth camera), wheel encoders
  • Localization: Extended Kalman Filter fusing odometry + IMU + LiDAR scan matching
  • Mapping: SLAM (Simultaneous Localization and Mapping) using Hector SLAM or Cartographer
  • Path Planning: A* or Dijkstra on occupancy grid, followed by DWA (Dynamic Window Approach) for local obstacle avoidance
  • Control: Raspberry Pi 4 for high-level planning, STM32 for real-time motor control
  • Communication: ROS 2 (Robot Operating System) for inter-process messaging
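The grid-based planning step in the stack above can be sketched with a compact A* implementation. The 4-connected neighborhood and Manhattan heuristic are common simplifying choices; a real planner would work on the SLAM-generated occupancy grid with inflated obstacles:

```python
# A* shortest path on a 0/1 occupancy grid (1 = obstacle), as used in
# the path-planning stage above. 4-connected moves, Manhattan heuristic.
import heapq

def astar(grid, start, goal):
    """Return the cell path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start)]   # (f, g, cell)
    came_from, g = {}, {start: 0}
    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:     # walk parents back to start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if cost > g.get(cur, float("inf")):
            continue                    # stale queue entry, skip
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],    # wall with a gap at the right edge
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)  # routes around the wall via the rightmost column
```

In the full stack this global path would then be handed to the DWA local planner, which tracks it while dodging obstacles the map doesn't know about.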

Conclusion & Next Steps

Building real-world projects is the best way to internalize embedded systems concepts. Start with the weather station to master sensor interfacing, progress through the line follower and greenhouse to learn control systems, and challenge yourself with the drone or autonomous robot to apply sensor fusion, real-time control, and system architecture principles.

Project Progression Advice:
  • Build each project on breadboard first, then move to custom PCB
  • Start with polling-based code, then refactor to interrupt- or RTOS-driven designs
  • Add logging and data export early to debug and analyze performance
  • Document your build process — portfolio projects accelerate your career
  • Share on GitHub with README, schematic, and demo video

In Part 14, we explore Career Pathways in Embedded Systems — roles, skills, certifications, and roadmaps for embedded systems, robotics, IoT, and hardware design careers.