Autonomous Robotics Engineer

Building Autonomous Racing Intelligence.

A fully self-built autonomous racing car featuring stereo visual SLAM, multi-sensor EKF fusion (wheel encoder, IMU, steering input), real-time path planning, and Ackermann steering control — running entirely on a Raspberry Pi 5.

  • 2 SLAM Systems
  • 3+ Sensors Fused
  • RPi 5 Compute
  • Ackermann Steering

System Architecture

How the Robot Thinks and Moves

A tightly integrated stack — from raw sensor data to wheel actuation — running entirely on commodity hardware.

Perception

  • ELP Global Shutter Stereo Camera
  • STM LSM6DSOX IMU
  • Wheel Encoder
  • Steering Angle Input
  • SGBM Stereo Depth
  • ORB-SLAM3 / RTAB-Map
  • Kalibr Calibration

Planning

  • RRT / A* Global Planner
  • Gap Following
  • Pure Pursuit
  • Racing Line Optimization

Control

  • Arduino Nano RP2040 Connect
  • Ackermann Steering (Servo)
  • Brushless Motor + ESC
Perception → Planning → Control, all communicating over ROS 2

Mapping and Localization

Perception Stack

Two independent SLAM systems run on the robot, each suited to different operating conditions and levels of sensor fusion.

Camera-IMU Calibration

Before any SLAM algorithm runs, the stereo camera and IMU must be spatially and temporally calibrated. Kalibr estimates the extrinsic transform between sensors and aligns their timestamps — a prerequisite for accurate visual-inertial odometry.

Camera-IMU Calibration (Kalibr)

Spatial and temporal calibration of the ELP stereo camera with the STM LSM6DSOX IMU using Kalibr. Required for accurate visual-inertial odometry.

ORB-SLAM3

A stereo visual-inertial SLAM system that relies purely on the camera and IMU. ORB feature extraction produces a sparse 3D map with real-time loop closure. The camera and IMU are tightly calibrated with Kalibr to enable accurate visual-inertial odometry.

ORB-SLAM3 Corridor Mapping

Real-time stereo visual SLAM in an indoor corridor. Sparse point cloud with loop closure and relocalization.

ORB-SLAM3 Map Output

Zoomed-out view of the sparse 3D map built by ORB-SLAM3, showing the reconstructed point cloud and navigable corridors.

RTAB-Map

Unlike ORB-SLAM3, RTAB-Map fuses a richer set of proprioceptive sensors before building the map. Wheel encoder odometry, IMU measurements, and the commanded steering angle are fed into an Extended Kalman Filter (EKF) node, which produces a fused odometry estimate. This is then combined with the camera's visual odometry inside RTAB-Map, giving the system a far more robust pose prior — especially in low-texture areas where pure visual methods drift.
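The EKF stage is the `robot_localization` `ekf_node`. A minimal parameter sketch is shown below; the topic names are placeholders rather than the robot's actual topics, and each `*_config` matrix is the package's 15-element boolean selector (x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az) choosing which state components that sensor contributes:

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true               # planar robot: ignore z, roll, pitch
    odom0: /wheel/odometry         # placeholder topic name
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  false, false,   # vx, vy, vz
                   false, false, false,   # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az
    imu0: /imu/data                # placeholder topic name
    imu0_config: [false, false, false,
                  false, false, true,     # fuse yaw orientation
                  false, false, false,
                  false, false, true,     # fuse yaw rate
                  true,  false, false]    # fuse forward acceleration
```

The commanded steering angle doesn't map onto a standard `robot_localization` sensor type directly; one common approach is to fold it into the wheel-odometry estimate before fusion.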

Sensor Fusion Pipeline

  • Wheel Encoder: velocity and distance traveled
  • STM LSM6DSOX IMU: angular rate and linear acceleration
  • Steering Input: commanded heading angle from Arduino
        ↓
  EKF Node (robot_localization): fused odometry
        ↓  joined by visual odometry from the stereo camera
  RTAB-Map: dense map + loop closure

RTAB-Map Dense Reconstruction

Dense RGB-D SLAM generating a rich occupancy grid for navigation. Appearance-based loop closure handles revisited environments.

RTAB-Map Output Visualization

2D occupancy grid output from RTAB-Map — the primary map representation consumed by the path planner.

Stereo Depth Estimation

The stereo camera pair feeds a Semi-Global Block Matching (SGBM) pipeline running at 15 FPS entirely on the Raspberry Pi 5 CPU. The resulting dense depth map is published as a ROS 2 topic, leaving enough CPU headroom for SLAM, planning, and other algorithms to consume it simultaneously — no dedicated GPU needed.
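To illustrate the core idea (not the full SGBM pipeline, which adds semi-global smoothness terms aggregated along scanlines), here is a minimal winner-take-all block match for one pixel using a sum-of-absolute-differences cost:

```cpp
#include <climits>
#include <cstdlib>
#include <vector>

// Simplified block matching: slide the right-image window left by each
// candidate disparity d and keep the d with the lowest SAD cost.
// Full SGBM adds smoothness penalties on top of this per-pixel cost.
int disparity_at(const std::vector<std::vector<int>>& left,
                 const std::vector<std::vector<int>>& right,
                 int row, int col, int half_block, int max_disp) {
    const int h = static_cast<int>(left.size());
    const int w = static_cast<int>(left[0].size());
    int best_d = 0;
    long best_cost = LONG_MAX;
    for (int d = 0; d <= max_disp; ++d) {
        long cost = 0;
        for (int r = row - half_block; r <= row + half_block; ++r) {
            for (int c = col - half_block; c <= col + half_block; ++c) {
                if (r < 0 || r >= h || c - d < 0 || c >= w) { cost += 255; continue; }
                cost += std::abs(left[r][c] - right[r][c - d]);  // SAD cost
            }
        }
        if (cost < best_cost) { best_cost = cost; best_d = d; }
    }
    return best_d;  // disparity in pixels; depth = focal * baseline / d
}
```

The disparity map then converts to metric depth via the calibrated focal length and stereo baseline.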

SGBM Stereo Depth

Semi-Global Block Matching running at 15 FPS on the Raspberry Pi 5 CPU, leaving sufficient headroom for other algorithms to consume the depth map concurrently — no GPU required.

Path Planning

Where to Go and How to Get There

A layered planning stack — global route computation paired with reactive local obstacle avoidance — for safe, efficient trajectories.

RRT
Rapidly-exploring Random Trees

Probabilistic sampling-based planner that efficiently explores high-dimensional configuration spaces to find feasible paths around obstacles.
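A stripped-down sketch of the RRT growth loop, here in an obstacle-free square world with hypothetical bounds and step size (a real planner would add a collision check at the marked point and record parent links to recover the path):

```cpp
#include <cmath>
#include <random>
#include <vector>

struct Pt { double x, y; };

// Minimal 2D RRT: sample a random point, extend the nearest tree node
// a fixed step toward it, repeat until the goal region is reached.
bool rrt_reaches(Pt start, Pt goal, double goal_radius,
                 double step, int max_iters, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> coord(0.0, 10.0);  // 10x10 world
    std::vector<Pt> tree{start};
    for (int i = 0; i < max_iters; ++i) {
        Pt sample{coord(rng), coord(rng)};
        // Find the tree node nearest to the sample.
        std::size_t nearest = 0;
        double best = 1e18;
        for (std::size_t j = 0; j < tree.size(); ++j) {
            double dx = tree[j].x - sample.x, dy = tree[j].y - sample.y;
            if (dx * dx + dy * dy < best) { best = dx * dx + dy * dy; nearest = j; }
        }
        // Extend one step toward the sample (collision check would go here).
        double dx = sample.x - tree[nearest].x, dy = sample.y - tree[nearest].y;
        double len = std::sqrt(dx * dx + dy * dy);
        if (len < 1e-9) continue;
        Pt node{tree[nearest].x + step * dx / len, tree[nearest].y + step * dy / len};
        tree.push_back(node);
        double gx = node.x - goal.x, gy = node.y - goal.y;
        if (std::sqrt(gx * gx + gy * gy) <= goal_radius) return true;
    }
    return false;
}
```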

A*
A-Star Search

Optimal grid-based path search using an admissible heuristic. Used for global route planning on the occupancy map generated by RTAB-Map.
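A compact sketch of A* on an occupancy grid of the kind RTAB-Map produces, using unit move costs and a Manhattan heuristic (the real planner would return the waypoint sequence rather than just its length):

```cpp
#include <cstdlib>
#include <functional>
#include <queue>
#include <tuple>
#include <vector>

// A* over a 2D occupancy grid (0 = free, 1 = occupied), 4-connected,
// with a Manhattan-distance heuristic (admissible for unit-cost moves).
// Returns the shortest path length in steps, or -1 if unreachable.
int astar_path_length(const std::vector<std::vector<int>>& grid,
                      int sr, int sc, int gr, int gc) {
    const int h = static_cast<int>(grid.size());
    const int w = static_cast<int>(grid[0].size());
    auto heur = [&](int r, int c) { return std::abs(r - gr) + std::abs(c - gc); };
    std::vector<std::vector<int>> cost(h, std::vector<int>(w, -1));  // g-values
    using Node = std::tuple<int, int, int>;  // (f = g + h, row, col)
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    cost[sr][sc] = 0;
    open.emplace(heur(sr, sc), sr, sc);
    const int dr[4] = {1, -1, 0, 0}, dc[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        Node top = open.top();
        open.pop();
        int r = std::get<1>(top), c = std::get<2>(top);
        if (r == gr && c == gc) return cost[r][c];
        for (int k = 0; k < 4; ++k) {
            int nr = r + dr[k], nc = c + dc[k];
            if (nr < 0 || nr >= h || nc < 0 || nc >= w || grid[nr][nc]) continue;
            int ng = cost[r][c] + 1;
            if (cost[nr][nc] == -1 || ng < cost[nr][nc]) {
                cost[nr][nc] = ng;  // better route found; re-queue
                open.emplace(ng + heur(nr, nc), nr, nc);
            }
        }
    }
    return -1;
}
```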

Gap Following
Follow-the-Gap Method

Reactive local planner that identifies the widest obstacle-free gap in depth data and steers toward it — ideal for tight corridors at speed.
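The gap selection itself reduces to a scan over one row of range readings. A minimal sketch (the real implementation would also account for vehicle width and the disparity-to-range conversion):

```cpp
#include <cstddef>
#include <vector>

// Follow-the-gap core: treat readings closer than a safety threshold as
// blocked, find the widest run of free readings, and return the index
// of its centre beam — the direction to steer toward.
std::size_t widest_gap_center(const std::vector<double>& ranges, double min_clear) {
    std::size_t best_start = 0, best_len = 0, start = 0, len = 0;
    for (std::size_t i = 0; i <= ranges.size(); ++i) {
        if (i < ranges.size() && ranges[i] > min_clear) {
            if (len == 0) start = i;  // a new free run begins
            ++len;
        } else {
            if (len > best_len) { best_len = len; best_start = start; }
            len = 0;  // run ended (obstacle or end of scan)
        }
    }
    return best_start + best_len / 2;
}
```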

Pure Pursuit
Pure Pursuit Controller

Geometric path tracker that computes the steering angle required to intercept a lookahead point on the planned trajectory.
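The geometry fits in a few lines. A sketch of the standard pure pursuit law, with the lookahead point expressed in the robot frame:

```cpp
#include <cmath>

// Pure pursuit: given a lookahead point (x forward, y left) in the
// robot frame and the wheelbase L, return the Ackermann steering angle
// that arcs the vehicle through that point:
//   delta = atan( 2 L sin(alpha) / Ld )
// where alpha is the heading error to the point and Ld its distance.
double pure_pursuit_steering(double x, double y, double wheelbase) {
    double lookahead = std::sqrt(x * x + y * y);
    double alpha = std::atan2(y, x);
    return std::atan2(2.0 * wheelbase * std::sin(alpha), lookahead);
}
```

A point dead ahead yields zero steering; points to the left or right yield proportionally sharper angles, with the lookahead distance acting as a smoothing knob.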

Trajectory Generation

Raw waypoints are smoothed into kinematically feasible trajectories respecting the Ackermann geometry. Velocity profiles are computed to maximize speed within safe lateral acceleration limits.
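The velocity profile falls out of the arc relation a_lat = v²·|κ|: solving for v gives the fastest safe speed at each point of the trajectory. A minimal sketch:

```cpp
#include <algorithm>
#include <cmath>

// Curvature-limited speed: lateral acceleration on an arc is
// a_lat = v^2 * |kappa|, so the fastest safe speed at curvature kappa
// is sqrt(a_lat_max / |kappa|), capped at the platform's top speed.
double max_speed(double kappa, double a_lat_max, double v_max) {
    if (std::fabs(kappa) < 1e-9) return v_max;  // effectively straight
    return std::min(v_max, std::sqrt(a_lat_max / std::fabs(kappa)));
}
```

A full profile generator would additionally sweep forward and backward along the path to respect longitudinal acceleration and braking limits between points.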

Racing Line Optimization

For time-trial scenarios, an optimization pass computes the minimum-curvature racing line through waypoints — enabling the robot to carry maximum speed through corners.

Low-Level Execution

Control Architecture

Translating planned trajectories into precise wheel commands at millisecond timescales.

Ackermann Steering Model

The robot uses rear-wheel drive with Ackermann steering geometry, so each front wheel is steered to a slightly different angle and traces its own arc about a common turn centre. This eliminates tire scrub and maintains stability at speed.
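The per-wheel angles follow directly from the geometry. A sketch, with turn radius measured to the rear-axle centre:

```cpp
#include <cmath>

// Ackermann geometry: for a turn of radius R measured to the rear-axle
// centre, the inner front wheel must steer more sharply than the outer
// one so both roll on concentric arcs without scrubbing.
// wheelbase and track are in metres, angles in radians.
struct WheelAngles { double inner, outer; };

WheelAngles ackermann_angles(double turn_radius, double wheelbase, double track) {
    return {
        std::atan(wheelbase / (turn_radius - track / 2.0)),  // inner wheel
        std::atan(wheelbase / (turn_radius + track / 2.0)),  // outer wheel
    };
}
```

On a single-servo linkage like this platform's, the steering linkage itself approximates this relationship mechanically rather than computing it per wheel.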

PID Controller

Proportional-Integral-Derivative control loops regulate both longitudinal speed and lateral steering angle, continuously minimizing tracking error against the planned trajectory.
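The same loop structure serves both controllers, with only the gains differing. A minimal discrete-time sketch:

```cpp
// Discrete PID loop: output = Kp*e + Ki*integral(e) + Kd*de/dt.
// The same struct drives both the speed and the steering controllers.
struct Pid {
    double kp, ki, kd;
    double integral = 0.0, prev_error = 0.0;

    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;                        // accumulate error
        double derivative = (error - prev_error) / dt; // rate of change
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```

A production loop would also clamp the integral term (anti-windup) and saturate the output to the actuator's limits.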

Model Predictive Control

MPC solves a finite-horizon optimization at each timestep, predicting future states and computing control inputs that minimize a cost function over the prediction window.

Arduino RP2040 Low-Level Interface

The Arduino Nano RP2040 Connect translates high-level velocity and steering commands from ROS 2 into PWM signals for the brushless motor ESC and steering servo, providing deterministic real-time actuation.
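The command-to-pulse mapping is a clamp and a linear scale. A sketch, assuming the standard 1000–2000 µs hobby-servo pulse range and a hypothetical ±0.35 rad steering travel (this platform's actual limits may differ):

```cpp
#include <algorithm>

// Map a steering angle to a hobby-servo pulse width. Standard RC servos
// expect 1000-2000 us pulses with 1500 us at centre; the +/-0.35 rad
// travel here is an assumed mechanical limit, not a measured one.
int steering_to_pulse_us(double angle_rad) {
    const double max_angle = 0.35;  // assumed servo travel, radians
    double a = std::clamp(angle_rad, -max_angle, max_angle);
    return static_cast<int>(1500.0 + a / max_angle * 500.0);
}
```

The ESC side works the same way, mapping a commanded velocity onto its throttle pulse range.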

Control Loop

Planner (trajectory) → Controller (PID / MPC) → Arduino (PWM signals) → Actuators (brushless motor + servo)

Odometry and state feedback loop back to the controller

The Platform

The Robot

A custom-built autonomous racing platform designed from the ground up.

Hardware Components

ELP Stereo Camera
STM LSM6DSOX IMU
Raspberry Pi 5
Arduino Nano RP2040

Tools and Technologies

Built With Precision

C++ — Dominant Codebase Language

C++ is the primary language across the robot's software stack. Its zero-overhead abstractions, deterministic memory management, and direct hardware access make it the right choice for real-time ROS 2 nodes where every microsecond of latency matters.

FreeRTOS on Arduino — Lowest Jitter, Highest Determinism

The Arduino Nano RP2040 Connect runs FreeRTOS to deterministically sample the IMU, read servo position, and capture motor PWM feedback — then stream that data to the Raspberry Pi at a high, consistent rate. Preemptive task scheduling and priority-based execution eliminate the jitter inherent in bare loop() polling, ensuring sensor readings arrive at the Pi with minimal and predictable latency.

Platform Comparison

How It Compares

A detailed comparison of sensor capability, algorithm coverage, real-time engineering, and industry readiness across the leading small-scale autonomous racing platforms.

F1TENTH (1/10 scale research)
  • Price: ~$4,000
  • Primary Sensor: 2D LiDAR (RPLiDAR)
  • Localization: LiDAR SLAM
  • Mapping: 2D occupancy grid
  • Depth Estimation: No (LiDAR only, no stereo)
  • IMU Fusion: Basic (VESC IMU)
  • Racing Algorithms: Gap Follow, Pure Pursuit, MPC
  • Behavioral Cloning: Not covered
  • Real-time Engineering: Partial (often ROS 1)
  • Camera Calibration: Basic
  • ROS Version: ROS 1 or ROS 2
  • Scale and Speed: 1/10 scale, fast
  • Industry Readiness: High (but inaccessible)

Donkey Car (DIY beginner RC)
  • Price: ~$250–$400
  • Primary Sensor: Single monocular camera
  • Localization: None (end-to-end only)
  • Mapping: None
  • Depth Estimation: None
  • IMU Fusion: None
  • Racing Algorithms: Behavioral cloning only
  • Behavioral Cloning: Basic TF/Keras
  • Real-time Engineering: No real-time concerns
  • Camera Calibration: None
  • ROS Version: No ROS
  • Scale and Speed: 1/10 scale, slow
  • Industry Readiness: Low

RoboRacer (1/10 competition)
  • Price: ~$1,400
  • Primary Sensor: Camera or LiDAR
  • Localization: Basic odometry or simulation
  • Mapping: None or simulation
  • Depth Estimation: None
  • IMU Fusion: None
  • Racing Algorithms: Usually one algorithm
  • Behavioral Cloning: Rarely
  • Real-time Engineering: Rarely taught
  • Camera Calibration: None
  • ROS Version: Varies
  • Scale and Speed: Slow / simulation
  • Industry Readiness: Low

This Platform (custom Ackermann)
  • Price: ~$500
  • Primary Sensor: Stereo camera + IMU (full VIO)
  • Localization: Stereo VIO + RTAB-Map 3D SLAM
  • Mapping: Full 3D dense mapping
  • Depth Estimation: Stereo depth + 3D reprojection
  • IMU Fusion: Full camera-IMU calibration (Kalibr) + EKF sensor fusion
  • Racing Algorithms: Gap Follow, Pure Pursuit, MPC + behavioral cloning
  • Behavioral Cloning: Production imitation learning on real hardware
  • Real-time Engineering: Real-time ROS 2, timestamp correction, hardware-aware coding
  • Camera Calibration: Full stereo + camera-IMU calibration with Kalibr
  • ROS Version: ROS 2 (production standard)
  • Scale and Speed: 1/12 scale, very high speed
  • Industry Readiness: High — same stack used in industry

Prices are approximate and vary by region and configuration.

Get In Touch

Contact

Interested in collaboration, research, or just talking robotics? Reach out.